A spy shot of the GeForce GTX 1060. Image source: BenchLife.info.
Graphics specialist NVIDIA is expected to launch its next generation mid-range desktop graphics processor, known as the GeForce GTX 1060, sometime this month. Though the general launch time frame had been leaked, the specifications and expected performance of the card weren't known.
Thanks to a leak of some slides that look as though they were produced by NVIDIA, published by graphics-card-focused website VideoCardz.com, we now know the key details of this upcoming card. Let's take a closer look.
According to the leak, the GeForce GTX 1060 packs 1,280 CUDA cores -- exactly half the number found inside NVIDIA's flagship GeForce GTX 1080 -- running at a boost speed of 1.7 GHz. It also comes with six gigabytes of GDDR5 memory rated at 8 Gbps. Assuming a memory bus width of 192 bits, VideoCardz calculates the memory bandwidth available to the GPU at 192 gigabytes per second, or 60% of the bandwidth available to the GTX 1080.
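VideoCardz's bandwidth figure is easy to sanity-check. A quick sketch of the arithmetic, using the rumored 192-bit bus and 8 Gbps memory speed from the leak (the GTX 1080 comparison assumes that card's known 256-bit bus and 10 Gbps GDDR5X):

```python
# Back-of-the-envelope check of the leaked memory bandwidth figure.
# Rumored GTX 1060 specs: 192-bit memory bus, 8 Gbps GDDR5.
bus_width_bits = 192
data_rate_gbps = 8  # per-pin data rate, gigabits per second

# Bandwidth (GB/s) = bus width in bits * per-pin rate, divided by 8 bits/byte
gtx1060_bandwidth = bus_width_bits * data_rate_gbps / 8
print(gtx1060_bandwidth)  # 192.0 GB/s, matching VideoCardz's number

# GTX 1080 for comparison: 256-bit bus, 10 Gbps GDDR5X
gtx1080_bandwidth = 256 * 10 / 8  # 320.0 GB/s
print(gtx1060_bandwidth / gtx1080_bandwidth)  # 0.6, i.e. 60%
```

Both the 192 GB/s figure and the 60%-of-GTX-1080 claim fall straight out of the leaked numbers.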
One of the slides that VideoCardz shows reads, "the power of GTX 980 for every gamer," which likely means that in terms of performance, the 1060 will be as fast as NVIDIA's late-2014 flagship GPU, the GeForce GTX 980.
The card itself uses a single six-pin PCI Express power connector and is rated at a 120-watt thermal design power. This means that the card is much more efficient than the GeForce GTX 980 (which is rated at 165 watts and requires two six-pin power connectors), something that will make life easier for users trying to upgrade systems with relatively low-quality power supplies.
In terms of display connectors, the slide shows that the GTX 1060 packs three DisplayPort 1.4 outputs, an HDMI 2.0b output, as well as a dual-link DVI connector. This should ensure compatibility with a wide range of displays.
A cost-reduced GTX 980
I believe that if the GTX 1060 is able to hit GeForce GTX 980-level performance across the board at the rumored price of $299, it should do well in the marketplace. What's important to note, though, is that the GeForce GTX 1060 should be cheaper to produce than the GTX 970/980, so NVIDIA should be better off bringing this part to market than selling dramatically price-reduced GTX 980s into this market.
Let's explore this a little bit more.
The GTX 980 was built on TSMC's 28-nanometer process and measured in at 398 square millimeters. The GTX 1060, assuming it's based on the GP106 chip, is estimated to measure in at somewhere between 170 square millimeters and 200 square millimeters on TSMC's newer 16-nanometer FinFET process (this estimate comes from VideoCardz.com as well). Let's call it 200 square millimeters to err on the side of caution.
Analyst Handel Jones estimates that a 16-nanometer wafer in late 2016 should cost about 70% more than a 28-nanometer wafer did at the end of 2014, implying a 70% increase in cost per unit of area. If we take this estimate to be true, the math still favors the new chip: roughly half the area at 1.7 times the cost per area works out to about 85% of the GTX 980's die cost, so a 200 square millimeter GTX 1060 chip should actually cost less to manufacture than a 398 square millimeter GTX 980. It's also worth noting that, all else equal, smaller chips tend to yield better than larger chips.
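The die-cost comparison above can be sketched in a few lines. This uses the article's own assumptions (a 70% cost-per-area premium for 16nm and the upper-end 200 square millimeter estimate for the GTX 1060 die) and deliberately ignores yield, which would tilt things further in the smaller chip's favor:

```python
# Rough relative die-cost comparison under the article's assumptions.
gtx980_area_mm2 = 398    # GTX 980 (GM204) die on TSMC 28nm
gtx1060_area_mm2 = 200   # upper end of VideoCardz's 170-200 mm^2 estimate
cost_per_area_ratio = 1.70  # 16nm wafer area assumed ~70% pricier than 28nm

# GTX 1060 die cost relative to the GTX 980 die cost
relative_die_cost = (gtx1060_area_mm2 / gtx980_area_mm2) * cost_per_area_ratio
print(round(relative_die_cost, 2))  # ~0.85: roughly 15% cheaper per die
```

Even with the conservative 200 square millimeter figure and the full 70% wafer-cost premium, the GTX 1060 die comes out ahead.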
From a raw die-cost perspective, it would seem that the GTX 1060 should be cheaper to make than the GeForce GTX 980.
Beyond the die cost, the board should also be cheaper to make. The GTX 1060 has a narrower memory bus than the GTX 980 (192-bit versus 256-bit), which should lead to a reduction in complexity of the board that the graphics processor is mounted on. Additionally, the GTX 1060 requires less power than the GTX 980 does (120 watts versus 165 watts), which should allow NVIDIA and board partners to use less expensive power delivery systems and save on a PCI Express power connector.
The only area where cost could increase is on the memory chip side of things; the GTX 980 comes with just four gigabytes of GDDR5 while the GTX 1060 should come with six gigabytes. That said, the GTX 980 launched before Samsung announced the mass production of 8 gigabit GDDR5 memory chips.
Samsung claimed in its January 2015 press release that its 8 gigabit GDDR5 chips were the "industry's first." Generally speaking, higher density memory chips tend to lead to lower cost per bit. In light of this, the six gigabytes of memory on the GTX 1060 may wind up no more expensive (or possibly even cheaper) than the four gigabytes of memory on the GTX 970/980.
The article NVIDIA Corporation's GTX 1060 Details Leak originally appeared on Fool.com.
Ashraf Eassa has no position in any stocks mentioned. The Motley Fool owns shares of and recommends Nvidia. Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.
Copyright 1995 - 2016 The Motley Fool, LLC. All rights reserved.