Modern games are trying to incorporate things like realistic bounces, ray-traced reflections, and depth of field by building real physics — F = ma and so forth — into the underlying engine. nVidia even names its architectures after physicists, hence "Fermi" for its latest.
http://en.wikipedia....ki/Physics_engine
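To give a feel for what "F = ma in the engine" means in practice: each frame, the engine turns forces into acceleration, then integrates velocity and position. A minimal sketch (the function and names are illustrative, not any real engine's API):

```python
def step(pos, vel, force, mass, dt):
    """Advance one body by one timestep using semi-implicit Euler."""
    acc = force / mass      # a = F / m
    vel = vel + acc * dt    # update velocity first...
    pos = pos + vel * dt    # ...then position from the new velocity
    return pos, vel

# Example: a 1 kg ball at height 10 m under gravity (-9.8 N), one 1/60 s frame.
pos, vel = step(pos=10.0, vel=0.0, force=-9.8, mass=1.0, dt=1 / 60)
```

Real engines do this for thousands of bodies plus collision detection, which is why offloading it to the GPU is attractive.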
The graphics hardware keeps getting more complicated, so they're starting to hit a wall — Fermi was late, and heat is apparently a problem with those chips. Of course, MS's DirectX drivers continue to become more and more complicated and to ship late as well, so there's the usual issue of the software and hardware rarely being in sync.
Unfortunately, you can't just pick the winner by looking for the most RAM, the most transistors, or the highest clock rate. It's complicated, and it depends on what you want to do with the card. E.g. http://www.maximumpc..._and_power_hungry - a 250W graphics chip...
I think it's nuts to spend more on a graphics card than on a CPU, but I'm not a gamer. :-)
HTH.
Cheers,
Scott.