+12V Rail Amp Rating

persondude2

Geek Trainee
I'm looking into buying a new video card, and the one that I want, the EVGA GeForce 8800GTS 640MB, says that it needs a "Minimum recommended power supply with +12 Volt current rating of 26 Amps."

What exactly does this mean? The highest amp rating I have seen for a power supply on Newegg is +12V@20 amps. Can I use two +12V 20 amp wires to power the card??

Thanks in advance for your help.
 
What the requirement means is that the total combined amperage on the +12V rails must be at least 26 amps. The card itself doesn't draw the full wattage that 12V x 26A provides, but most of the components in a modern PC run off +12V. For safety reasons, the ATX 2.2 spec (if I remember correctly) places a limit of 20A per +12V rail. If you get a power supply with, say, two 18A-20A +12V rails, you should be fine.
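
If it helps, here's a minimal sketch (Python, with made-up rail ratings) of that combined-amperage check:

```python
# Rough sketch: does a PSU's combined +12V capacity meet a card's
# "minimum 26 A on +12V" recommendation? Rail ratings below are made up.

RAIL_RATINGS_AMPS = [18, 18]   # e.g. a dual-rail unit, 18 A per +12V rail
REQUIRED_AMPS = 26             # from the card's spec sheet

combined_amps = sum(RAIL_RATINGS_AMPS)
combined_watts = 12 * combined_amps

print(f"Combined +12V capacity: {combined_amps} A ({combined_watts} W)")
if combined_amps >= REQUIRED_AMPS:
    print("Meets the 26 A recommendation")
else:
    print("Falls short of the 26 A recommendation")
```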

There are a few power supplies (like the Silverstone Olympia series) that skip multiple +12V rails in favor of a single high-amp one (50-60A :eek: ), but you'll find that most use multiple +12V rails. As long as you get enough +12V capacity one way or another, you'll be fine.
 
Thanks for your thorough reply; however, I'm still a little confused. I asked the same question on other hardware forums and I'm getting a lot of conflicting answers.

One person posted a link to this power supply FAQ, which appears to state the opposite of what you said:
Here are the facts: A large, single 12-volt rail (without a 240VA limit) can transfer 100% of the 12-volt output from the PSU to the computer, while a multi-rail 12-volt design has distribution losses of up to 30% of the power supply’s rating. Those losses occur because power literally gets “trapped” on under-utilized rails. For example, if the 12-volt rail that powers the CPU is rated for 17 amps and the CPU only uses 7A, the remaining 10A is unusable, since it is isolated from the rest of the system.

Since the maximum current from any one 12-volt rail of a multiple-rail PSU is limited to 20 amps (240VA / 12 volts = 20 amps), PCs with high-performance components that draw over 20 amps from the same rail are subject to over-current shutdowns. With power requirements for multiple processors and graphics cards continuing to grow, the multiple-rail design, with its 240VA limit per rail, is basically obsolete.
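
If I'm following that, the FAQ's example works out roughly like this (the second rail's numbers are just ones I made up to show the problem):

```python
# Illustrative only: a per-rail 240 VA cap means spare amps on one +12V rail
# can't be borrowed by a load on another rail.

RAIL_CAP_AMPS = 240 / 12   # 240 VA / 12 V = 20 A per rail

# (rail rating in amps, load in amps); CPU rail figures are from the FAQ,
# the GPU rail figures are made up
rails = {"CPU rail": (17, 7), "GPU rail": (20, 22)}

for name, (rating, load) in rails.items():
    headroom = rating - load
    status = "OK" if load <= min(rating, RAIL_CAP_AMPS) else "over-current shutdown risk"
    print(f"{name}: {load} A of {rating} A used, {headroom:+} A headroom ({status})")

# The spare 10 A on the CPU rail does nothing for the overloaded GPU rail,
# whereas a single-rail design could serve the same total load.
```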
 
That depends on the unit; better power supplies tend to use more efficient designs, so the loss is minimal.

The 8800GTX, which draws more power than the 8800GTS, consumes up to 185W under load. 12V x 20A = 240W, which is more than enough to cover it. The second rail would service the CPU, SATA drives and fans.
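
In rough numbers (a ballpark sketch, using the 185W figure above):

```python
# Back-of-the-envelope headroom for the card on a single 20 A +12V rail.
RAIL_AMPS = 20
RAIL_WATTS = 12 * RAIL_AMPS     # 240 W available on that rail
GPU_LOAD_WATTS = 185            # worst-case 8800GTX figure quoted above

print(f"Rail: {RAIL_WATTS} W, GPU: {GPU_LOAD_WATTS} W, "
      f"headroom: {RAIL_WATTS - GPU_LOAD_WATTS} W")
```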

Take a look at SLI Zone's power supply listing. It shows which power supplies are SLI Certified for which graphics cards.

What you linked is true, but what I said is also true. The 8800GTS's big brother, the 8800GTX, consumes at most 185W, although some have said it's closer to 150W. Assuming it's on a +12V rail with 20A, you have 240W available.
A separate rail would feed the CPU, hard drives and fans. A high-end CPU may consume 100-110W, while fans and hard drives only draw around 5W each.

The link is correct that there's unused wattage on a multi-rail design. However, it only becomes a problem if the power draw on one rail exceeds what that rail can provide. Most good power supplies are 80% efficient or better, so even if you only count on 80% of a 20A rail you'd still have at least 192W (240W x 0.8) to work with.
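
Putting the whole budget together (a rough sketch; the component draws are ballpark assumptions, not measurements):

```python
# Rough two-rail budget using the figures above; all component draws are
# ballpark assumptions.

RAIL_WATTS = 12 * 20            # 240 W per +12V rail (the 240 VA cap)
MARGIN = 0.8                    # only count on 80% of each rail, to be safe

gpu_rail = {"8800-series card": 185}
system_rail = {"high-end CPU": 110, "hard drives": 15, "case fans": 10}

for name, loads in (("GPU rail", gpu_rail), ("System rail", system_rail)):
    total = sum(loads.values())
    budget = RAIL_WATTS * MARGIN   # ~192 W usable per rail on that assumption
    verdict = "fine" if total <= budget else "too much for one rail"
    print(f"{name}: {total} W drawn vs {budget:.0f} W budget -> {verdict}")
```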

Also, what you linked is from PC Power & Cooling, who, while they do produce some good power supplies, also tend to play up issues that have turned out to be less of a big deal than they make them sound. It's biased in the way it's written.
 