Today's PC is far more multimedia-oriented. High-definition graphics is only "just coming" to television, yet it is already exceeded by what top-end PC graphics systems can manage - that's why a high-resolution flat screen for the PC costs twice as much as the same-size screen for TV and video. In future, software will need to be written to switch, seamlessly and in real time, between background business applications and the graphics and video work done in the foreground - and the business applications may want to take direct advantage of the GPU.
And the GPU isn't just for drawing pictures. Talk to any cryptography expert and you'll find they are all trying to find ways of harnessing that extraordinary power. To quote Wikipedia: "Recent developments in GPUs include support for programmable shaders which can manipulate vertices and textures with many of the same operations supported by CPUs, oversampling and interpolation techniques to reduce aliasing, and very high-precision colour spaces. Because most of these computations involve matrix and vector operations, engineers and scientists have increasingly studied the use of GPUs for non-graphical calculations."
And: "Because these applications fall outside the GPU's intended usage, a new term, GPGPU, is usually employed to describe them. While GPGPUs are the same chips as GPUs, there is increased pressure on manufacturers from 'GPGPU users' to improve hardware design, usually focusing on adding more flexibility to the programming model."
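To make the "matrix and vector operations" point concrete, here is a minimal sketch, in plain Python, of the data-parallel model that GPGPU computing exploits. The kernel below performs SAXPY (a·x + y, a standard vector operation); on a GPU it would be launched once per element across thousands of threads at once, while here it is merely simulated sequentially. The function names are illustrative, not any particular GPU API.

```python
def saxpy_kernel(i, a, x, y):
    # One GPU thread's worth of work: a single multiply-add on element i.
    # Crucially, it depends only on element i, not on any other element.
    return a * x[i] + y[i]

def saxpy(a, x, y):
    # The "host" side: launch one kernel instance per element. Because every
    # instance is independent, a GPU can run them all in parallel - exactly
    # the property that makes matrix and vector work GPU-friendly.
    return [saxpy_kernel(i, a, x, y) for i in range(len(x))]

if __name__ == "__main__":
    x = [1.0, 2.0, 3.0]
    y = [10.0, 20.0, 30.0]
    print(saxpy(2.0, x, y))  # [12.0, 24.0, 36.0]
```

The "flexibility" GPGPU users were demanding of manufacturers was precisely about making arbitrary kernels like this one, rather than only fixed graphics operations, expressible on the hardware.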
The black hole of the processor has, at last, started to attract the GPU and the GPGPU. AMD feels it has to move now, before the GPU becomes part of an Intel platform rather than part of a generic processor platform.