ATI vs. Nvidia?

Discussion in 'Video Cards, Displays and TV Tuners' started by Carl Holcomb, Aug 7, 2011.

  1. Carl Holcomb

    Carl Holcomb Geek Trainee

    Likes Received:
    1
    Trophy Points:
    3
The second biggest question that any true hardcore gamer should know: who is better, Nvidia or ATI? Nvidia reads each picture as one flat plane of polygons, making lines and corners less defined (only to a degree that shows on a 1080p monitor if you use a magnifying glass), and shadow lighting doesn't proportion correctly. ATI reads each picture texture as multiple polygons (tessellation), which makes landscape creation much slower; word going around is that Nvidia is finding a way to do this better than ATI, but we have yet to see it. ATI did do a good job making landscapes a small percentage better than Nvidia's. If you are a hardcore gamer and want a high frame rate at maximum graphics, then Nvidia is what you want. If you don't care that your game runs slow as hell (sorry for the hate, ATI) but want exceptional graphics, ATI is what you want. Truth is, ATI graphics are so slow that games are almost unbearable to a serious gamer even at medium settings. Nvidia still does a very good job at rendering (only a professional can see any difference between the two) and still runs games at such a high frame rate that they look more realistic. True gamers should go with Nvidia. Movie lovers who feel like spending $700 for slightly higher-def movies should buy ATI. And for you simple people who aren't extreme gamers or movie watchers, just stick with on-board graphics.

    Do not quote me on the polygon generation techniques, because they are always changing for performance and the above could be out of date within a year. In no way should you repeat the exact polygon generation details of either Nvidia or ATI, at the risk of this post being out of date, or of misconstruing the information and looking like a total idiot to someone like me (someone who designs graphics).

    Do not whine to me about which company you think is better if you are comparing ATI's best video card to Nvidia's worst card, or vice versa, because in the end you will look like a total 'dumb-ass.'

    Looking to buy your very own card? Go here http://www.hardwareforums.com/threads/picking-your-perfect-video-card.28899/

    Please respond with questions or your favorite video card decision.
     
  2. sf1lonefox

    sf1lonefox Geek Trainee

    Likes Received:
    0
    Trophy Points:
    1
    I'm not sure I agree with you on that. ATI cards almost always come cheaper, and they come equipped with higher memory capacity and, rather often, higher clock rates. Today what really makes the difference between the cards, and what sets Nvidia apart from ATI (or should I say AMD), is all the extras they throw in. Stereoscopic 3D should be a strong seller for Nvidia now. PhysX, I'm sure, will more easily show you a radical drop in fps than those corners you mention. Fact is, many new games happily make use of it (and in fact, why wouldn't they?). Then of course there is CUDA, which makes Nvidia the favorite amongst the 3D users.

    Fact still is, though: do you really need all the extras that come with the card? Having had an Nvidia card most of the time, I can say no to that. Until I get a 3D screen I don't need stereoscopic, my 3D app does not use CUDA, and PhysX is the only one I use often. However, I'm sure my next PC will have a 3D monitor, so then I can say yes to all of those, meaning my next card will most likely be an Nvidia.

    Fact still is, though, ATI has been doing a far better job than Nvidia (just look at the benchmarks for the last three years). Were it not for the extras, Nvidia would barely hold its own.
     
  3. Carl Holcomb

    Carl Holcomb Geek Trainee

    Likes Received:
    1
    Trophy Points:
    3
    The Nvidia benchmarks for the 400 and 500 series have been running practically even with the ATI Radeon HD 5000 and 6900 series. Recent benchmarks of Nvidia's top three cards against ATI's top three have shown that Nvidia is almost even with ATI on just about every platform, apart from a few where Nvidia dominates.

    If you were to compare percentages within those benchmarks, Nvidia's average performance would come out roughly 6% ahead of ATI's.

    Nowadays the clock rate on its own makes very little difference. You have to look at the shader clock (which ATI doesn't use as a separate domain), memory clock speed, stream processors, memory interface width, pixel pipelines, and memory type (GDDR3 vs. GDDR5). All of those go into the quality of the gameplay. You could have the highest clock rate in the world, but with a 64-bit memory interface and 128 stream processors it would still be a relatively terrible card. If those are the additives you were referring to that Nvidia always adds, then those are what make a great card. ATI generally does have more memory and a higher core clock, but their stream processors, memory clock, and memory interface are much lower than Nvidia's. Nvidia also has the famous shader clock, which literally doubles the core clock but runs separately: a card like mine with an 810 MHz core clock runs its shaders at 1620 MHz. Since the shaders run in a separate domain, Nvidia only labels the original core speed, so Nvidia GPUs are generally much faster than the headline clock suggests, unlike what you tried to express.
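To put some rough numbers behind the point that bus width matters as much as raw clock, here's a quick back-of-the-envelope sketch in Python. The figures are illustrative only, not the specs of any particular card:

```python
def memory_bandwidth_gbps(effective_clock_mhz, bus_width_bits):
    """Theoretical peak memory bandwidth in GB/s:
    effective memory clock (MHz) times bus width in bytes."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# A screaming-fast clock on a narrow 64-bit bus...
narrow = memory_bandwidth_gbps(4000, 64)    # 32.0 GB/s
# ...still loses badly to a lower clock on a 256-bit bus.
wide = memory_bandwidth_gbps(3200, 256)     # 102.4 GB/s

print(narrow, wide)
```

That is why a card can advertise the highest clock rate around and still be, as said above, a relatively terrible card.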

    If you were to spend $1400 to quad-CrossFire two Radeon Sapphire HD 6990s and compare that to a quad-SLI pair of Nvidia GTX 590s for the same $1400, you would notice that each Radeon card carries one more gigabyte of memory than its Nvidia counterpart, so the Radeon setup would have 8 GB of memory on the box compared to the Nvidia setup's 6 GB (bear in mind that in CrossFire and SLI the framebuffer is mirrored across GPUs, so the memory a game can actually address is the per-GPU amount, not the total). The Nvidia setup would be able to create more polygons with better measurements because of the Fermi CUDA cores, making Nvidia better at polygon creation. The Nvidia memory clock is higher, and once you have 6 GB of video memory you will never need more, making the Nvidia cards best when it comes to memory rating (memory speed combined with total memory). Comparing the two setups, the frame rate will be virtually the same whether V-sync is on or not; with cards that benchmark this high you will likely see no difference at all.
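The memory math above can be spelled out in a few lines of Python. The per-card figures are the ones quoted in this thread (HD 6990: 4 GB per dual-GPU card; GTX 590: 3 GB per dual-GPU card), not independently verified specs:

```python
cards = 2                      # two dual-GPU cards = a quad-GPU setup
radeon_per_card_gb = 4         # HD 6990: 2 GB per GPU, 2 GPUs per card
geforce_per_card_gb = 3        # GTX 590: 1.5 GB per GPU, 2 GPUs per card

# The totals printed on the box:
radeon_total = cards * radeon_per_card_gb     # 8 GB
geforce_total = cards * geforce_per_card_gb   # 6 GB

# In CrossFire/SLI the framebuffer is mirrored across GPUs, so the
# memory a game can actually use is the per-GPU amount, not the total:
radeon_usable = radeon_per_card_gb / 2        # 2.0 GB
geforce_usable = geforce_per_card_gb / 2      # 1.5 GB

print(radeon_total, geforce_total, radeon_usable, geforce_usable)
```

So the 8 GB vs. 6 GB gap on paper works out to a 2.0 GB vs. 1.5 GB gap in practice.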

    When you're looking at it from a budget perspective, Galaxy (an Nvidia board partner) makes the best cards for the money by far. A $70 Galaxy GT 440 keeps pace with many $200 ATI cards.

    When you are looking at it from a gamer's perspective where money is no object, EVGA (another Nvidia board partner) makes top-of-the-line polygon creators, which is what you want more than anything for gaming.

    Another thing people like to do is use a motherboard that supports running both video cards at once: the Nvidia card handles PhysX and polygon measurements to take a hefty load off the ATI card, which separates all the textures into extra polygons for rendering. There is no need (even if it were possible) to try to SLI or CrossFire them together, because they work separately as long as you pick the Nvidia GPU for PhysX in the driver settings. Keep in mind that keeping the drivers up to date and working at all times will be an arduous task, along with making sure the Nvidia card stays the main display adapter and the PhysX card while the ATI card handles the extra polygon generation.

    Not all motherboards and not all ATI-Nvidia combos will work with the two brands installed at the same time, so please exercise caution.

    Do the smart thing and go with Nvidia, not for the extras (like 3D), but for the hardcore gaming capabilities, because in the end ATI cannot compete with Nvidia and never has been able to, which is why Nvidia has always been the leader in performance.
     

Share This Page