Exfoliate
Geek Trainee
I realize ATI's and nVidia's chipsets aren't identical, but I was wondering why nVidia can get more out of, say, a 350MHz chip (like the 6800GT) than ATI can. If the Radeon X800XT and the 6800 Ultra both ran at 350MHz, for example, the Ultra would trounce the XT. I'm guessing it must be the architecture, since otherwise, aside from the XT having slightly fewer transistors, the specs would be pretty similar. So how does it work (more or less)? With processors I understand that Intel has really high clock speeds yet can be destroyed (in some things) by AMD's and Apple's chips running at far lower clocks, because the fabrication process and architecture are quite different (so nVidia is like AMD...), but the video card case isn't so obvious to me. Hope you don't mind clearing this up, thanks.
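
To show what's confusing me, here's a quick back-of-the-envelope Python sketch of the on-paper math. I'm assuming the usual fill-rate approximation (pixel pipelines x core clock) and the commonly quoted 16-pipeline counts for both chips; the matched 350MHz clock is just my hypothetical, not either card's real clock:

    # Rough theoretical pixel fill rate = pixel pipelines x core clock.
    # Pipeline counts are the commonly quoted specs; 350 MHz is hypothetical.
    cards = {
        "GeForce 6800 Ultra (at 350 MHz)": {"pipes": 16, "clock_mhz": 350},
        "Radeon X800 XT (at 350 MHz)":     {"pipes": 16, "clock_mhz": 350},
    }

    for name, spec in cards.items():
        fill_rate = spec["pipes"] * spec["clock_mhz"] * 1e6  # pixels per second
        print(f"{name}: {fill_rate / 1e9:.1f} Gpixel/s theoretical")

    # Both come out identical on paper (5.6 Gpixel/s), so any real gap at equal
    # clocks has to come from something else: per-clock efficiency, shaders,
    # memory controller, drivers, etc. That's the part I don't get.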