Hi, everyone! I've recently started using Coolbits to OC my video card, and I used the automatic clock detection. Anyway, I haven't run any tests yet (only 3DMark2001 SE), and since I saw in a forum that "You should NEVER use the automatic clock detection feature", I naturally got concerned. Sure, I have to run proper tests before keeping the OCed frequencies for good, but is it really that dangerous (the automatic detection, I mean)? Any experiences or info?
Seeing you have a 5500 series graphics card, using the automatic clock detection feature might be pretty risky. 5500 series GPUs from Nvidia don't have temperature sensors, so you won't be able to tell when the card is running too hot, and you'll risk frying it. Use this topic on overclocking an Nvidia GPU instead. Getting the maximum OC on that card is a time-consuming process, but it's also the safest. (Ignore the stuff about the latest drivers being 81.95; they're 93.71 at the moment.) Good luck, Marcus_X
Thanks for the link, Marcus_X! Yeah, the 5500 series cards don't have thermal sensors, which is sad. But does anyone know how Coolbits detects the optimal frequency? I mean, is it like "Oh, you've got an FX 5500, so you'll get that frequency", or does it actually test the card? You see, Coolbits says my 'optimal' frequencies are Core: 315 MHz and Memory: 811 MHz(!), which for the memory is twice the stock speed, and considering that the 5500 is just an OCed version of the 5200, that's scary!
I've found that the automatic clock speeds the drivers detect are way off for stability, so, IMO, the best thing to do is OC manually. Yeah... it, uh... builds character!
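If you'd rather flip the Coolbits switch by hand instead of merging one of the downloadable coolbits.reg files, here's a minimal sketch using Python's built-in winreg module on Windows. The registry location (HKLM\SOFTWARE\NVIDIA Corporation\Global\NVTweak) and the DWORD value of 3 are assumptions based on how Coolbits was commonly enabled on ForceWare-era drivers, so double-check them against your driver version before running anything.

```python
# Minimal sketch: enable NVIDIA's Coolbits clock controls on an old
# ForceWare-era Windows driver by writing the registry value directly.
#
# Assumptions (verify for your driver version):
#   - key path:   HKLM\SOFTWARE\NVIDIA Corporation\Global\NVTweak
#   - value:      DWORD "CoolBits" = 3 unlocks the clock frequency page
#
# Run from an elevated (administrator) Python prompt.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location


def enable_coolbits(value: int = 3) -> None:
    # Create the key if it doesn't exist, then set the CoolBits DWORD.
    with winreg.CreateKeyEx(
        winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE
    ) as key:
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, value)


if __name__ == "__main__":
    enable_coolbits()
    print("CoolBits set; reopen the NVIDIA control panel to see the clock sliders.")
```

Once the clock page shows up, ignore the "detect optimal frequencies" button, raise the core and memory clocks in small steps, and test for artifacts and stability between each bump, as the guide above describes.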