Will Advanced Micro Devices' (AMD) Bet on Super-Fast Memory Pay Off?

Advanced Micro Devices has been struggling against rival NVIDIA in the GPU market, losing market share and reporting dismal fourth-quarter earnings. AMD is expected to launch a new batch of GPUs during the first half of this year, and while I'm not optimistic that a single product launch can right the ship, the company does have a trick up its sleeve. AMD is rumored to be using high-bandwidth memory, or HBM, in some of its new GPUs, providing drastically more memory bandwidth than NVIDIA's products. AMD needs any advantage it can get, but how meaningful an impact will HBM really have?

Why memory bandwidth is important
GPUs process a tremendous amount of data per second. While performance is ultimately limited by a GPU's processing power, a potential bottleneck is the rate at which the GPU can read data from its own memory. This rate is called the memory bandwidth.

To get a sense of how much data is involved, imagine a graphics card with 4GB of memory running a game at 60 frames per second. Each frame, the GPU needs to deal with all of the data necessary to draw the scene, including data representing all of the objects, like characters and buildings, and the textures applied to those objects. Given the visual detail and richness of modern PC games, an enormous amount of data is needed to make the image on the screen look realistic.

If this hypothetical graphics card needs to access all 4GB of its memory once per frame, the memory bandwidth needs to be 240 GB/s (4GB per frame times 60 frames per second). This is a simplification: Some data may need to be accessed more than once per frame, and some less frequently, but it's a good ballpark figure. NVIDIA's high-end GTX 980, for example, has a memory bandwidth of 224 GB/s, allowing it to access nearly all of its 4GB of memory every frame.
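Here's a minimal sketch of that back-of-envelope estimate. The function name and the assumption that every byte of memory is read exactly once per frame are purely illustrative; the figures are the article's example numbers, not measured data.

```python
# Back-of-envelope memory bandwidth estimate (illustrative only).

def required_bandwidth_gbs(memory_gb, fps, reads_per_frame=1.0):
    """Rough bandwidth (GB/s) needed to touch the whole frame buffer every frame."""
    return memory_gb * fps * reads_per_frame

# A 4GB card at 60 frames per second, each byte read once per frame.
print(required_bandwidth_gbs(4, 60))   # 240.0 GB/s
# NVIDIA's GTX 980 offers roughly 224 GB/s, just shy of that figure.
```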

As the screen resolution increases, the GPU needs to deal with an increasing amount of data. A 4K monitor has quadruple the number of pixels compared to a 1080p monitor, and that means all of the per-pixel data that the GPU churns through quadruples as well.
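As a quick check of that quadrupling claim, assuming the standard 1920x1080 and 3840x2160 pixel grids:

```python
# Pixel counts for standard display resolutions (simple arithmetic).
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels

print(pixels_4k / pixels_1080p)   # 4.0 -- four times the per-pixel work
```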

Star Citizen, an upcoming crowdfunded PC game, will support not only 4K resolutions, but 8K resolutions as well. This requires the development team to include 4K and 8K textures for any object that gets close to the player; a lower-resolution texture would appear blurry. The memory bandwidth required to handle all of this data will be truly enormous, to say the least.

Existing graphics cards can run games at 4K resolutions, but not particularly well at high settings. NVIDIA's GTX 980 has trouble hitting playable frame rates in many games, and part of the reason is likely its limited memory bandwidth.

AMD could own 4K gaming, at least for a while
AMD is reportedly going to use HBM in its high-end graphics cards this year, providing memory bandwidth as high as 640 GB/s, according to unconfirmed specifications. That's nearly triple what the GTX 980 offers, and it should help AMD's new GPUs excel at 4K gaming.
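For reference, the "nearly triple" figure simply follows from dividing the rumored (unconfirmed) HBM number by the GTX 980's published bandwidth:

```python
# Rumored HBM bandwidth vs. the GTX 980's published figure.
hbm_rumored_gbs = 640   # unconfirmed specification
gtx_980_gbs = 224

print(hbm_rumored_gbs / gtx_980_gbs)   # ~2.86 -- nearly triple
```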

NVIDIA will eventually adopt HBM, but it's unlikely to happen until GPUs based on its next-gen architecture, Pascal, are released sometime in 2016. This gap will give AMD's products a distinct advantage in the 4K gaming market for a while.

However, the ultimate importance of this advantage is questionable. The percentage of PC gamers using 4K monitors is currently minuscule, and with all of this extra bandwidth largely unnecessary at lower resolutions, being first may not lead to many extra sales for AMD. Steam's hardware and software survey puts the percentage of its users with a monitor above a resolution of 1080p at just 3.25%. Only 0.03% of users have monitors with 4K resolutions.

Another potential issue is the price. HBM is a new memory technology, which means that it's going to be expensive. We won't know how expensive until AMD's new GPUs are officially announced, but the highest-end GPU may end up being priced well above NVIDIA's GTX 980.

Now, it's possible that AMD's new GPUs will spur adoption of 4K monitors, given that most existing GPUs can't handle 4K very well today. But 4K monitors are still quite expensive, and 1080p monitors are far cheaper in comparison. I suspect that 4K monitors will need to become much more affordable for 4K gaming to go mainstream, and that puts a limit on the benefits AMD can realize by being first to market with an HBM graphics card.

If the rumors prove true, AMD's use of HBM will certainly give it an advantage. But given the meager adoption rate of 4K monitors, it probably won't matter very much in the grand scheme of things.

The article Will Advanced Micro Devices' (AMD) Bet on Super-Fast Memory Pay Off? originally appeared on Fool.com.

Timothy Green owns shares of Nvidia. The Motley Fool recommends Apple and Nvidia. The Motley Fool owns shares of Apple. Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.
