Hardware report card: Nvidia vs AMD

The AMD R9 Fury X is here, marking the first time since 2013 that both AMD and Nvidia have had new, competitive GPU hardware out at the same time. The GTX 980 Ti and the Radeon R9 Fury X are both priced at $650 and aimed at high-end 1440p and 4K gaming. The Fury X represents a huge release for AMD, while Nvidia’s been releasing a steady stream of Maxwell graphics cards since late 2014. Against that backdrop, we thought it was time for a check-in: where do AMD and Nvidia stand right now, where are they going, and most importantly, whose graphics cards should you buy?

First off, neither company deserves your abiding loyalty. Don’t blindly follow AMD or Nvidia from one graphics card release to the next, singing their praises over the competition. It should go without saying that you should head where the best product is. And when we’re talking graphics cards, the best product isn’t just about hardware. It's about driver performance and updates. It's about software and exclusive features. And it's about the accessories that work with those graphics cards, like G-Sync and FreeSync monitors.

In this feature I've taken a look at each of these areas and graded AMD and Nvidia on how they stand right now. Here are the categories:


Report card table of contents

The graphics cards
Graphics driver updates and performance
Graphics driver software
Exclusive features: GameWorks and TressFX
FreeSync vs. G-Sync


[Image: AMD R9 Fury X, side view]

The graphics cards

It can be tough to keep track of every card in the GPU market. Here are the ones you should know about right now, divided into some rough performance tiers. I’ll talk about how Nvidia and AMD cards match up in performance and price beneath each chart.

High-end

Specs                  | R9 Fury X                 | GTX 980 Ti                | GTX Titan X               | GTX 980
VRAM                   | 4GB HBM                   | 6GB GDDR5                 | 12GB GDDR5                | 4GB GDDR5
Stream processors/CUDA | 4096                      | 2816                      | 3072                      | 2048
Base clock             | 1050 MHz                  | 1000 MHz                  | 1000 MHz                  | 1126 MHz
Boost clock            | 1050 MHz                  | 1075 MHz                  | 1075 MHz                  | 1216 MHz
Memory clock           | 500 MHz (1 GHz effective) | 1750 MHz (7GHz effective) | 1750 MHz (7GHz effective) | 1750 MHz (7GHz effective)
Memory bandwidth       | 512 GB/s                  | 336 GB/s                  | 336 GB/s                  | 224 GB/s
Texture units          | 256                       | 176                       | 192                       | 128
ROP units              | 64                        | 96                        | 96                        | 64
Compute                | 8.6 TFLOPS                | 5.6 TFLOPS                | 6.14 TFLOPS               | 4.6 TFLOPS
TDP                    | 275W                      | 250W                      | 250W                      | 165W
Price                  | $650                      | $650                      | $1000                     | $500

Specs                  | R9 390X    | R9 295X2
VRAM                   | 8GB        | 4GB x 2
Stream processors/CUDA | 2816       | 2816 x 2
Base clock             | 1050 MHz   | 1018 MHz
Boost clock            | 1050 MHz   | 1018 MHz
Memory clock           | 1500 MHz   | 1250 MHz
Memory bandwidth       | 384 GB/s   | 320 GB/s x 2
Texture units          | 176        | 176 x 2
ROP units              | 64         | 64 x 2
Compute                | 5.9 TFLOPS | 11.5 TFLOPS
TDP                    | 275W       | 500W
Price                  | $429       | $630
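
A quick aside: the compute and memory bandwidth figures in these tables aren't magic numbers; they fall straight out of the shader counts, clocks, and bus widths. Here's a minimal sanity-check sketch (assuming the standard two FLOPs per shader per clock for both GCN and Maxwell, and the base clocks listed above):

```python
# Sanity-check the compute and bandwidth figures in the tables above.
def tflops(shaders, clock_mhz, flops_per_clock=2):
    # Single-precision TFLOPS = shaders * clock (MHz) * FLOPs per clock / 1e6
    return shaders * clock_mhz * flops_per_clock / 1e6

def bandwidth_gb_s(effective_clock_ghz, bus_width_bits):
    # Bandwidth (GB/s) = effective memory clock (GHz) * bus width (bits) / 8
    return effective_clock_ghz * bus_width_bits / 8

print(tflops(4096, 1050))         # R9 Fury X:   ~8.6 TFLOPS
print(tflops(2816, 1000))         # GTX 980 Ti:  ~5.6 TFLOPS
print(tflops(3072, 1000))         # GTX Titan X: ~6.1 TFLOPS
print(bandwidth_gb_s(1.0, 4096))  # Fury X HBM, 4096-bit:  512.0 GB/s
print(bandwidth_gb_s(7.0, 384))   # 980 Ti GDDR5, 384-bit: 336.0 GB/s
```

The bus widths (4096-bit for the Fury X's HBM, 384-bit for the 980 Ti) aren't in the tables above, but they're part of each card's published specs.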

AMD releasing the R9 Fury X at $650 was an aggressive move, matching the 980 Ti at the high end. The Fury X’s power usage is right in line with the 980 Ti’s, which is a great improvement over AMD’s last generation of cards. And it comes very close on performance, but not quite close enough for us to recommend it over the 980 Ti. Improved drivers may give the Fury X a performance edge down the road, but right now it lacks overclocking headroom and frametime consistency, and the 4GB of memory is going to be a limitation in future games. As our colleagues at Maximum PC summarized, “Fury X looks great on paper, but it needs fine-tuning and drivers to realize its potential.”

The 980 Ti, meanwhile, is a much smarter buy than the more expensive Titan X, offering very similar performance for $450 less. The 6GB framebuffer is ample for most of today’s games even at 4K, and by the time you’re actually using 12GB of VRAM, you’ll probably want a newer, faster card than the Titan X. The other extremely high-end option is AMD’s R9 295X2, a dual-GPU card that can still slightly outperform the mighty Titan X, and now costs only $630. That’s a ton of performance for the price, but it has drawbacks. As we covered in our Titan X review, the dual-GPU 295X2 posts much lower minimum framerates with some severe frametime stuttering problems, draws huge amounts of power, and needs some serious cooling to stay operational. It’s a beast of a card, but from an efficiency standpoint, the 980 Ti and Titan X are better buys.

Finally, we’ve got the GTX 980 and the R9 390X. AMD’s new 300 series cards are actually rebrands of the previous 200 series graphics cards, using the same GPU technology. The GPUs are old, but AMD has made some tweaks, including bumping the memory up to 8GB on the 390X (and the 390, which we’ve put in the mid-range category) and clocking that memory 20 percent faster than on the otherwise virtually identical 290X. As Maximum PC’s review of the 390X shows, the card comes close to the GTX 980’s performance, as the 290X did before it...but the GTX 980 is a much stronger overclocker, while running quieter and drawing far less power.

The 390X is priced aggressively, and the 8GB of VRAM is a nice touch, but it’s hard to say how valuable that memory will be. The 390X is better suited to high-end 1080p and 1440p gaming than to 4K, which is where the extra memory would really pay off. If you’re gaming at 1080p, you could save even more money by buying an older 290X with 4GB of memory.

While we don’t know what Nvidia cards will be coming to market in the next six months, we do know that AMD has the R9 Fury coming in July at $100 less than the Fury X. It uses the same Fiji GPU, but is air-cooled instead of liquid-cooled. There’s also the small form-factor Fury Nano, and a dual-GPU Fury card, coming in the fall. Those cards will give AMD a much stronger line-up, and at $100 cheaper than the 980 Ti, the R9 Fury could be a great high-end buy.

[Image: Nvidia GeForce GTX 970]

Mid-range

Specs                  | GTX 970                            | R9 390
VRAM                   | 4GB (3.5GB at full bus speed)      | 8GB
Stream processors/CUDA | 1664                               | 2560
Base clock             | 1050 MHz                           | 1000 MHz
Boost clock            | 1178 MHz                           | 1000 MHz
Memory clock           | 1750 MHz                           | 1500 MHz
Memory bandwidth       | 196 GB/s (3.5GB) + 28 GB/s (512MB) | 384 GB/s
Texture units          | 104                                | 160
ROP units              | 56                                 | 64
Compute                | 4 TFLOPS                           | 5.1 TFLOPS
TDP                    | 145W                               | 275W
Price                  | $330                               | $329


AMD matched the price of the GTX 970 with its R9 390, which, like the rest of the 300 series, is a slightly tweaked version of the older 200 series cards with 8GB of VRAM. Its core clock is a bit higher, and its memory clock is increased 20 percent.

Price-wise, 8GB of VRAM at $330 is a hell of a bargain. Still, the GTX 970 vs. the 390 is pretty much the same matchup as the GTX 970 vs. the 290: the 970 is going to win on overclocking, noise, and power efficiency while delivering slightly better performance. And neither card is really suited to 4K gaming, which is where the extra VRAM would be most valuable. The GTX 970 is the better card, though it’s possible that, two years from now, its 3.5GB of full-speed memory will be a limiting factor in memory-hungry games.
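
That 3.5GB caveat stems from the 970’s partially disabled memory subsystem: 3.5GB of its VRAM sits behind a 224-bit path, with the last 512MB on a separate 32-bit path. The split bandwidth figure in the table falls out of the same arithmetic; a quick check (segment widths per Nvidia’s public explanation of the design):

```python
# GTX 970: 7 Gbps effective GDDR5 split across two unequal segments.
effective_gbps = 7.0
print(effective_gbps * 224 / 8)  # fast 3.5GB segment: 196.0 GB/s
print(effective_gbps * 32 / 8)   # slow 512MB segment:  28.0 GB/s
```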

[Image: Sapphire R7 370]

Low-end


Specs                  | GTX 960     | R9 380              | R7 370      | R7 360
VRAM                   | 2GB / 4GB   | 2GB / 4GB           | 2GB / 4GB   | 2GB
Stream processors/CUDA | 1024        | 1792                | 1024        | 768
Base clock             | 1127 MHz    | 970 MHz             | 975 MHz     | 1050 MHz
Boost clock            | 1178 MHz    | 970 MHz             | 975 MHz     | 1050 MHz
Memory clock           | 1752 MHz    | 1375 MHz / 1425 MHz | 1400 MHz    | 1625 MHz
Memory bandwidth       | 112 GB/s    | 182.4 GB/s          | 179.2 GB/s  | 104 GB/s
Texture units          | 64          | 112                 | 64          | 48
ROP units              | 32          | 32                  | 32          | 16
Compute                | 2.3 TFLOPS  | 3.48 TFLOPS         | 2 TFLOPS    | 1.61 TFLOPS
TDP                    | 120W        | 190W                | 110W        | 100W
Price                  | $199 / $240 | $199 / $220         | $149 / $180 | $109

In the realm of cheaper graphics cards, AMD overwhelms Nvidia with options. The GTX 960 is a good card, and its Maxwell GPU will draw less power than the AMD GPUs while delivering much better performance than the R7 370 and 360, which are very much budget cards. The R9 380 outperforms the GTX 960 by about 5-10 fps, according to Tom’s Hardware’s tests, though the GTX 960 can close some of that gap by overclocking.

Thankfully, there are 4GB versions of both the GTX 960 and the R9 380, which we’d recommend: 2GB of VRAM just isn’t enough these days. The R9 380 is a strong performer for its price, and the better bargain overall, though for a budget build, you may need a stronger power supply than you’d buy for a GTX 960 setup.

Nvidia: A

Nvidia’s Maxwell GPU is power efficient and fantastically overclockable, with a hard-to-beat option in each pricing tier.

AMD: B

The new Fury X is strong, but the rest of AMD’s aging, rebranded cards are outmatched by the more efficient Nvidia GPUs, though priced well. The remaining Fury cards arriving this summer and fall may help level the field.


[Image: Nvidia GeForce GTX Titan X cards in SLI]

Graphics driver updates and performance

I remember buying a $2000 laptop in 2006 with grand designs: its AMD processor and ATI GPU would be powerful enough to play games that the average laptop would choke on. And I also remember the neverending nightmare that dealing with drivers for that graphics card turned out to be: I eventually had to go to third-party drivers to get the laptop to play nice with my TV. Drivers were a problem for ATI a decade ago, and they still seem to be a problem for AMD today.

As our colleagues at Maximum PC pointed out in their Fury X review, Nvidia released 10 WHQL GameReady drivers in 2014 and has released eight in the first half of 2015. AMD released four in 2014, and hasn’t released a single one since December. AMD does put out beta driver releases roughly once per month, but they can and should do better at optimizing for new game releases and getting those drivers out to all gamers, not just the ones seeking out betas.

Nvidia’s GameReady drivers aren’t guaranteed to offer great performance for a new game. Sometimes performance improves significantly in driver releases post-launch, and sometimes those launch-day drivers have issues with SLI performance or other graphical features. But Nvidia is better than ever at frequently updating its drivers, which means those launch problems are less frequent and fixed more quickly. Nvidia has also worked hard in recent years to improve driver support for multi-GPU SLI setups.

[Image: Nvidia GeForce Experience driver update screen]

When The Witcher 3 was released, some Nvidia card owners complained that the game performed poorly on 700 series Kepler GPUs, alleging that Nvidia was deliberately crippling Kepler’s performance to push sales of its newer Maxwell cards. Afterwards, Nvidia released the 353.06 driver, which delivered some performance improvements on Kepler cards. As for Nvidia sabotaging Kepler’s performance over time, it’s hard to find any real evidence of that, and that assumption ignores the incredible complexity of modern games, the millions of lines of code that go into making drivers compatible with a vast array of games and hardware, and the changes in GPU technology that make newer hardware more efficient at certain tasks. From what we’ve seen, Kepler performance has remained equivalent, or even improved slightly, with driver releases since 2013.

Frequent driver releases are important, but not as important as driver performance. While most benchmarks use average framerates as a guide to a graphics card’s general performance, minimum framerates and frametimes tell a story that’s just as important: how smooth or stuttery a game’s going to be. And this is where AMD’s drivers fall short.
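
To see why, consider two captures with near-identical average framerates, one steady and one with periodic spikes. A toy illustration (the frametime numbers are invented for demonstration, not taken from any benchmark):

```python
# Two runs with the same average fps but very different smoothness.
smooth  = [16.7] * 60                # steady ~60 fps
stutter = [12.0] * 50 + [40.0] * 10  # mostly fast, with periodic 40ms spikes

def avg_fps(frametimes_ms):
    return 1000 * len(frametimes_ms) / sum(frametimes_ms)

def frametime_99th(frametimes_ms):
    ordered = sorted(frametimes_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

for name, run in (("smooth", smooth), ("stutter", stutter)):
    print(f"{name}: {avg_fps(run):.1f} fps average, "
          f"{frametime_99th(run):.1f}ms 99th-percentile frametime")
```

The stuttery run actually posts the slightly higher average (60.0 fps vs. 59.9), but its worst frames take well over twice as long to render, and those are the frames you feel as hitching.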

When PCPer first explored the importance of measuring frametimes, it showed that AMD’s cards (particularly in CrossFire) posted far more varied frametimes than Nvidia’s. Things aren’t as bad today, but even the brand new R9 Fury X is struggling to produce even, consistent frametimes. Check out TechReport’s detailed benchmarks to see how AMD’s flagship card compares to the much smoother 980 Ti and Titan X.

Their conclusion: “the Fury X struggles to live up to its considerable potential. Unfortunate slowdowns in games like The Witcher 3 and Far Cry 4 drag the Fury X's overall score below that of the less expensive GeForce GTX 980. What's important to note in this context is that these scores aren't just numbers. They mean that you'll generally experience smoother gameplay in 4K with a $499 GeForce GTX 980 than with a $649 Fury X. Our seat-of-the-pants impressions while play-testing confirm it. The good news is that we've seen AMD fix problems like these in the past with driver updates, and I don't doubt that's a possibility in this case. There's much work to be done, though.”

[Image: AMD Catalyst Omega]

AMD needs to improve both its frequency of driver releases and its frametime consistency to match Nvidia. Easier said than done, of course, because Nvidia has a huge army of software engineers working on its drivers. But until AMD improves in this area, drivers will continue to be a knock against buying its hardware.

Nvidia: A

Nvidia’s drivers aren’t always perfect, but the company updates them frequently, is on the ball for big game launches, and generally improves performance with new releases.

AMD: C

AMD’s drivers lag behind on performance, in general, and are also less frequently updated, which is especially problematic for game performance on launch day.


[Image: Nvidia GeForce Experience]

Graphics driver software

Since Nvidia introduced the GeForce Experience, it’s been continuously adding useful features to the software. In-app driver updating is convenient. Game optimization is nice if you don’t like messing with settings, though it’s our least-used feature. Dynamic Super Resolution, enabled in GFE or in the Nvidia Control Panel, is a convenient downsampling solution for running games at higher resolutions. Most importantly, there’s ShadowPlay, which can do DVR-style background recording and manual recording at up to 4K, 60 fps, and 130Mbps. 4K recording is currently extremely demanding even for SLI Titan Xs, but it’s an option.
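
As an aside, that 130-megabit ceiling translates into serious disk usage; a back-of-the-envelope estimate (real files will vary with the encoder and settings):

```python
# Rough disk usage at ShadowPlay's maximum bitrate.
bitrate_mbps = 130             # megabits per second
mb_per_sec = bitrate_mbps / 8  # ~16.25 megabytes per second
gb_per_hour = mb_per_sec * 3600 / 1000
print(f"{gb_per_hour:.1f} GB per hour")  # ~58.5 GB per hour
```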

To AMD’s credit, they’ve attempted to match Nvidia’s features with Gaming Evolved, but the software isn’t as good. GeForce Experience is a utility: it does what you want it to, and doesn’t bother you with anything extraneous. AMD Gaming Evolved, by comparison, ties into Raptr, and wants you to use the social gaming service. It advertises free-to-play games within a busier, messier UI. And its video capture solution is limited in quality and options compared to Nvidia’s, topping out at 1080p.

AMD Gaming Evolved is unpleasant to use: it feels like software built out of sponsorship opportunities rather than a tool made to improve the experience of using AMD’s hardware.

[Image: AMD Gaming Evolved]

[Image: AMD Catalyst Control Center]

AMD does have one software strength, and that’s overclocking within the AMD Catalyst Control Center. Overclocking Nvidia cards requires downloading separate software, though those tools, like MSI Afterburner and EVGA PrecisionX, give you lots of valuable information on the card’s voltage, temperature, clocks, and so on. Catalyst Control Center’s overclocking utility is more basic, but it does let you quickly and easily overclock AMD’s graphics cards. That’s a plus.

Both Nvidia and AMD support multi-monitor gaming through Nvidia Surround and AMD Eyefinity, but we haven’t spent enough time with either recently to say how they compare in support and ubiquity. You can find a list of games with native multi-monitor support on the Widescreen Gaming Forum.

Nvidia: A+

Nvidia’s GeForce Experience makes driver downloads easy and offers some great features that are regularly improving.

AMD: C+

AMD has matched some of Nvidia’s software features, but in an application filled with ads and extraneous social elements.


[Image: Nvidia HairWorks wolves in The Witcher 3]

Exclusive graphics features: Nvidia GameWorks and AMD TressFX

For years, AMD and Nvidia have been landing exclusive partnerships with game developers to optimize games for their hardware. An Nvidia or AMD logo appears before the title screen, and framerates often favor the partner’s graphics cards thanks to close collaboration. More recently, both have developed some exclusive technology to feature in those games that also performs better on their own hardware.

For AMD this was TressFX, used for Lara Croft’s hair in Tomb Raider. Nvidia has a much wider library of technology called GameWorks that it offers to game developers. GameWorks includes post-processing like TXAA and HBAO+, HairWorks for hair and fur rendering, PhysX for GPU-accelerated physics, and more. As proprietary software, GameWorks is naturally optimized for Nvidia’s hardware. And this causes some problems.

The most basic problem: some GameWorks features aren’t available to AMD card owners. That sucks for AMD owners, but it does make sense: Nvidia invests a good deal of time, manpower and money into developing these proprietary technologies. And some features like Nvidia HairWorks that do work on AMD cards tend to brutalize AMD’s performance. AMD has called this intentional sabotage, while in other instances developers have blamed AMD for not working with them to optimize game performance.

This passage from ExtremeTech does a good job of summarizing this issue:

“Whether GameWorks impairs performance on competing hardware depends entirely on how you parse the meanings behind the statement. Nvidia would argue that incorporating a GameWorks library doesn’t intrinsically impair competing hardware because HairWorks is an optional, user-controlled setting. Integrating a GameWorks library doesn’t impact performance in the title as a whole — if HairWorks doesn’t run properly on an AMD chip, turning it off fixes the problem (from Nvidia’s perspective, at least).

AMD would argue that “impair” has a wider meaning than Nvidia has assigned it. As we discussed last year, one of the differences between the two firms is that AMD tends to rely more on source code optimization than Nvidia does, which means closed-source libraries tend to hurt Team Red more than Team Green. Claiming that AMD could write its own libraries to accomplish essentially the same tasks ignores the fact that game developers adopt libraries to simplify their lives, and relatively few of them are willing to adopt two different solutions for doing the same thing. Practically speaking, AMD doesn’t have the financial resources to attack Nvidia head-on in this fashion in any case. Thus, from AMD’s perspective, GameWorks absolutely impairs its own ability to optimize code.”

AMD is more of a proponent of open source software than Nvidia, but that’s the way it goes with the underdog—AMD doesn’t have the same investment in software engineering that Nvidia has with GameWorks. Still, AMD put its money where its mouth is with TressFX, making it open source and working to ensure that it runs as well on Nvidia hardware as on AMD’s own.

[Image: AMD TressFX]

Nvidia’s GameWorks libraries may be good for game developers, and it’s fair for Nvidia to keep that code proprietary. Nvidia’s assistance in development and driver optimization, with engineers embedded within development studios, is also a good thing. But the way games are increasingly optimized along Green Team and Red Team party lines is bad for PC gaming overall. It’s limiting and punishing to some players, often for the sake of marketing and branding opportunities. Both companies are guilty of this, and it’s unlikely to stop any time soon.

Nvidia: B-

Graphics libraries and support help developers, but exclusivity can be harmful to the platform as a whole.

AMD: B

More limited graphics features and optimization issues aside, AMD is more committed to open source with tech like TressFX. But it still makes exclusivity deals that can be harmful to the platform.


[Image: Acer XG270HU]

Variable refresh technology: G-Sync vs. FreeSync

Nvidia’s G-Sync is a great variable refresh technology. It allows a monitor with a special G-Sync board to sync its refresh rate to the output of an Nvidia graphics card, presenting a smooth framerate from 30-144 Hz (there are some 4K G-Sync monitors that can’t refresh up to 144 Hz, but the technology supports it). It’s a proprietary solution that’s also a bit expensive, as it requires that hardware module installed in the monitor. But variable refresh is a fantastic technology, and once you see the smoothness, it’s very difficult to go back.
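
The judder that variable refresh eliminates is easy to model. On a fixed 60 Hz panel with vsync, a finished frame has to wait for the next 16.7ms refresh boundary, so a game rendering at a steady 50 fps ends up with some frames held on screen twice as long as others. A toy sketch of that mismatch (timings illustrative only):

```python
# Fixed 60 Hz vsync vs. a steady 50 fps feed: frames can only change at
# 16.67ms refresh boundaries, so on-screen hold times come out uneven.
import math

refresh_ms = 1000 / 60  # fixed 60 Hz panel
frame_ms = 1000 / 50    # game finishes a frame every 20ms

shown_at = []
for frame in range(1, 7):
    ready = frame * frame_ms
    # each frame appears at the first refresh boundary after it's ready
    shown_at.append(math.ceil(ready / refresh_ms - 1e-9) * refresh_ms)

holds = [round(b - a, 1) for a, b in zip(shown_at, shown_at[1:])]
print(holds)  # [16.7, 16.7, 16.7, 16.7, 33.3] -- uneven pacing, i.e. judder
# A variable refresh display instead refreshes when each frame is ready,
# holding every frame for an even 20ms.
```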

AMD has its own solution called FreeSync, which uses the VESA Adaptive Sync DisplayPort standard. G-Sync only works with Nvidia graphics cards, and FreeSync only works with AMD graphics cards. The difference is that FreeSync does not require any added hardware, making it more open and ideally cheaper in the long run.

G-Sync has its own advantages, as this PCPer dissection of the two technologies shows. G-Sync’s dedicated hardware gives it an edge at low refresh rates, and Nvidia also recently introduced G-Sync support on laptop screens and added support for windowed mode. Nvidia is also ahead of AMD in monitor availability, though more monitors supporting each technology are on the way.

Both technologies are a great improvement to your gaming experience.

Nvidia: A-

G-Sync’s technology is more powerful and there are currently more monitors available, but at a higher price.

AMD: B+

FreeSync is currently a slightly more limited technology, but monitors are available at better prices.

Conclusion

AMD's big play with the Fury X and High Bandwidth Memory has at least brought the company back into relevant competition with Nvidia, but there's still a lot of work left to do. The 300 series cards are rebrands of older cards with great prices and some nice bumps in VRAM, but none of the architectural or power improvements of Nvidia's graphics cards. Hardware-wise, AMD's next six months look promising, with several more Fury cards on the way.

Software remains Nvidia's real strength. Its drivers are simply better than AMD's, and they're updated more frequently. The GeForce Experience is a great piece of software, and AMD Gaming Evolved is not. AMD needs to up its driver game to improve frametimes in demanding games and get day-one drivers ready for major releases.

Unfortunately, both companies are likely to continue making exclusive deals with game developers to optimize their games for one graphics company or the other, which is bad for PC gaming as a whole. More optimistically, the benefits of High Bandwidth Memory should have an even bigger impact on graphics cards as AMD and Nvidia adopt HBM 2.0 in the future—hopefully in 2016. AMD needs to continue improving its drivers and capitalize on its early start with HBM before Nvidia comes out with its own implementation.




