ATI first used DDR2 in the 9800 Pro (256MB version), and now you can buy it for a reasonable price, but is it worth it? I've read that although it transfers data about 33% faster, it actually accesses data a good deal slower, go figure. What do you think: is it superior to DDR (in other words, have they worked on the slow access times), or do you have to end up buying DDR2 PC4300 just to outdo DDR (PC3700, for example)? Thanks everyone.
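To put rough numbers on that 33% figure, here's a quick back-of-the-envelope sketch; the module names and transfer rates are just the common ratings I'm assuming, not anything official:

```python
# Peak bandwidth of a standard 64-bit (8-byte wide) memory module.
# Illustrative numbers only -- DDR-400 (PC3200) vs DDR2-533 (PC2-4300).
def peak_bandwidth_mb_s(megatransfers_per_sec):
    """Peak bandwidth in MB/s for an 8-byte-wide DIMM."""
    return megatransfers_per_sec * 8

ddr_400  = peak_bandwidth_mb_s(400)   # DDR PC3200  -> 3200 MB/s
ddr2_533 = peak_bandwidth_mb_s(533)   # DDR2 PC4300 -> ~4264 MB/s

print(f"DDR-400:  {ddr_400} MB/s")
print(f"DDR2-533: {ddr2_533} MB/s")
print(f"DDR2-533 is about {100 * (ddr2_533 - ddr_400) / ddr_400:.0f}% faster")  # ~33%
```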
DDR2's big drawback is higher latency. If memory speeds were ramping up rapidly in the base frequencies (say up to PC6000) and that became the standard, then DDR2 would be necessary, since DDR1 supposedly has a design limit on how high it can go. The problem is, we really aren't hitting those speeds fast enough to negate the ill effects of DDR2's higher latency (rough numbers in the sketch below). DDR2 is only supported by the Intel 915 and 925X chipsets, although I think SiS has one or two out that also support DDR2. DDR2 is a 240-pin module vs. the 184-pin on DDR. In other words, it's not compatible, nor is it available for the AMD side of things. I'm not sure if there's some confusion, but I can't recall ATi using DDR2. I know they've been using DDR since the early Radeon days, but I have yet to see any video card use DDR2. I know there's a big move toward GDDR3 on video cards, but not to DDR2. If you've got a reputable link, feel free to show it.
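Here's a minimal sketch of what I mean about the latency: the actual access delay in nanoseconds is the CAS cycle count divided by the memory clock. The timings are just typical numbers I'm assuming for illustration, not specs from any particular module:

```python
# Higher CAS count on a not-much-faster clock means DDR2 takes longer
# to begin each access, even though it moves more data once it starts.
def cas_latency_ns(cas_cycles, clock_mhz):
    """Access delay in nanoseconds: CAS cycles divided by the memory clock."""
    return cas_cycles / clock_mhz * 1000

ddr_400_cl2_5 = cas_latency_ns(2.5, 200)   # DDR-400 runs a 200 MHz clock
ddr2_533_cl4  = cas_latency_ns(4.0, 266)   # DDR2-533 runs a ~266 MHz clock

print(f"DDR-400  CL2.5: {ddr_400_cl2_5:.1f} ns")   # 12.5 ns
print(f"DDR2-533 CL4:   {ddr2_533_cl4:.1f} ns")    # ~15.0 ns
```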
It's probably not worth it, but of course if benchmarks prove it's beneficial for the price, then I guess so. Anyway, I don't know whether DDR2 is really a good option; compared to GDDR3 and XDR, it's not as good for graphics.
Here ya go, Big B: http://hardware.gamespot.com/Story-ST-536-x-9-9-x "Besides having twice the memory, the 9800 Pro 256MB is notable for its use of DDR2 SDRAM..." Apparently the 9250 also ended up utilizing this technology, I guess: http://www.pricerunner.co.uk/computing/components/graphics-cards/224325/details I first read this in "Computer Games" magazine. It was awfully short-lived because the XT became the card of choice, and then GDDR3 came out last summer and changed it all. Speaking of which, what does the "G" stand for that precedes the Double Data Rate 3?
So it's tweaked or designed for graphics cards then, I guess? I'm assuming we won't be putting sticks of GDDR3 in our mobos, but I could be wrong.
GDDR3 has a different I/O and a higher clock speed in general than DDR2. Not sure about the intricate differences, but I guess it would be optimised for graphics loads.
Thanks, dude. I knew more or less why GDDR3 was superior to DDR2, but I was unsure about it ever being used in PCs (they'd probably just call it DDR3, I guess).
No confusion, but you pay for it in extra dough. You buy a gig of Apple's RAM and it costs up to $350 (for the ol' Mac Mini). Macs are particularly good for new users who want an incredibly easy, user-friendly experience. Grandma wants to write e-mails and browse the web without any hitches? Get her an eMac. We all know that PCs have a much, much larger software and hardware base available, and are therefore often more affordable (generally speaking) and more popular with the gaming crowd, for example.