NVIDIA RTX Pricing | Stroke of Genius or Victim of the Cryptocurrency Aftermath?

Unless you've been living under a rock, you've probably heard about NVIDIA's new RTX 20 series cards, which were announced on the 20th of August 2018. I won't bore you with the specs, but they're very expensive.

NVIDIA GeForce RTX 2080 Ti Founders Edition card (image from PCWatch)

On August 20th, at a special event in Germany, NVIDIA unveiled the RTX 2070 (MSRP of $499, or $599 for the FE), the RTX 2080 (MSRP of $699, or $799 for the FE) and finally the big daddy of them all, the RTX 2080 Ti (MSRP of $999, or $1,199 for the FE). The new lineup ditches the GTX branding in favour of RTX branding, and the cards carry new hardware features that enable RTX effects, such as ray tracing, in many upcoming AAA titles. It should be noted that RTX support does not necessarily indicate ray tracing support.

With that out of the way, this is a pretty interesting launch. The insane pricing aside (which we'll get to later), NVIDIA don't usually launch the x80 Ti card this early on. Here's a brief overview of previous x80 Ti pricing and launch schedules.

The GTX 780 Ti launched on November 7th 2013 for $699, 5 months 15 days after the release of the GTX 780, which launched for $649.

The GTX 980 Ti launched on June 2nd 2015 for $649, 8 months 14 days after the release of the GTX 980, which launched for $549.

The GTX 1080 Ti launched on March 10th 2017 for $699, 9 months 11 days after the release of the GTX 1080, which launched for $599.

GTX 780 Ti (left), GTX 980 Ti (middle) and GTX 1080 Ti (right)

As you can see, the x80 Ti cards have followed a pretty similar pricing structure and release schedule since they were first conceived with the second Kepler generation back in 2013. The RTX 2080 Ti not only represents a 71% price increase over the GTX 1080 Ti ($1,199 for the FE versus $699), but was also announced at the same time as the x70 and x80 variants; a first for NVIDIA's release schedule, as the x80 Ti cards usually follow some months later.

In fact, if we look back, NVIDIA's pricing has been pretty stable for the x60, x70 and x80 cards dating back to the Fermi era; even before then, during the 200 series and prior, prices were very stable.

x80 class: GTX 480/580/680 ($499), GTX 780 ($649), GTX 980 ($549), GTX 1080 ($599)

x70 class: GTX 470/570 ($349), GTX 670/770 ($399), GTX 970 ($329), GTX 1070 ($379)

x60 class: GTX 460/660 ($229), GTX 560/960 ($199), GTX 760 ($249), GTX 1060 (3GB $199 / 6GB $249, or $299 for the FE)

As you can see, prices were pretty stable from Fermi to Pascal, and even before then. So why would NVIDIA deviate from a pricing structure that has served them well for over a decade? It keeps the cards affordable at various price points while also allowing NVIDIA to maintain high profit margins.

Well, it goes without saying that the Turing dies are pretty big. The RTX 2080 Ti (TU102) die measures in at 754mm2, making it the second-largest mass-produced GPU die in history, right behind the monolithic 815mm2 GV100 die, which takes the top spot. Both of these monsters are fabbed on TSMC's 12nm FFN node, which NVIDIA and TSMC have customised specifically for efficiency and for dies approaching the reticle limit.

TU102 die ~ 754mm2


The RTX 2080 (TU104) die measures in at 538mm2; we do not know the size of the RTX 2070 (TU106) die at the time of writing. While these cards are being manufactured on the mature 12nm FFN process, the die size still affects the economics of scale: the larger the die, the fewer candidates per wafer and the lower the yields, which means higher prices. And Turing dies are much larger than Pascal's; the top-end GP102 die used for the 1080 Ti and Titan X/Titan Xp measured in at 471mm2.

TU104 die ~ 538mm2


Images from WCCFTech and Expreview

Some will point out that NVIDIA have previously sold cards with similar die sizes for cheaper, but it's not as simple as that, and it isn't a valid comparison: a 500mm2 die on 16/12nm costs more to produce than a 500mm2 die on 45nm. Wafer prices are increasing, R&D costs for new nodes are increasing, and even yields are decreasing as we approach the 'end of Moore's Law'.
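To put some rough numbers on the yield argument, here's a minimal back-of-envelope sketch. Every input is an assumption chosen for illustration (a 300mm wafer, a guessed wafer price and a guessed defect density), using the standard dies-per-wafer approximation and a simple Poisson yield model; this is not TSMC's actual cost data.

```python
# Back-of-envelope die cost model. All numbers below are illustrative
# assumptions, not TSMC's real figures.
import math

WAFER_DIAMETER_MM = 300.0
WAFER_COST_USD = 7000.0     # assumed 12nm wafer price (a guess)
DEFECTS_PER_MM2 = 0.001     # assumed defect density, i.e. 0.1 defects/cm^2

def dies_per_wafer(die_area_mm2: float) -> int:
    """Gross dies per wafer via the standard approximation, which
    discounts the partial dies lost around the wafer edge."""
    r = WAFER_DIAMETER_MM / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2: float) -> float:
    """Wafer cost spread over only the dies that survive defects."""
    yield_rate = math.exp(-DEFECTS_PER_MM2 * die_area_mm2)  # Poisson yield
    return WAFER_COST_USD / (dies_per_wafer(die_area_mm2) * yield_rate)

for name, area in [("GP102", 471), ("TU104", 538), ("TU102", 754)]:
    print(f"{name} ({area}mm2): {dies_per_wafer(area)} gross dies/wafer, "
          f"~${cost_per_good_die(area):.0f} per good die")
```

With these made-up inputs, a TU102 works out at more than double the per-good-die cost of a GP102 despite being only around 60% larger. Big dies lose you twice over: fewer candidates fit on each wafer, and a smaller fraction of them come out working.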

Another factor will no doubt be the additional cost GDDR6 adds over GDDR5 and GDDR5X. In recent months the price of GDDR5 and GDDR5X has risen, and GDDR6 is reported to be around 20% more expensive than GDDR5, at least for the initial launch.

Consider also that GDDR5 prices were about $25 lower for video card makers last year, when insider quotes put the cost of outfitting a card with 8GB of GDDR5 at around $50. That rise is already reflected in some MSRPs that have officially been increased.

The increased cost of GDDR6 memory will likewise feed into RTX pricing.
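As a quick sanity check on what that means per card, here's a sketch using only the figures quoted above: the ~$50 quote for 8GB of GDDR5 last year, the ~$25 rise since, and the ~20% GDDR6 premium. These are loose assumptions, not confirmed bill-of-materials numbers.

```python
# Rough per-card memory cost estimate. Every input is an assumption taken
# from the loose figures quoted in the article, not a confirmed price.
GDDR5_8GB_LAST_YEAR = 50.0                  # ~$50 insider quote for 8GB
GDDR5_8GB_NOW = GDDR5_8GB_LAST_YEAR + 25.0  # ~$25 rise since then
GDDR6_PREMIUM = 1.20                        # GDDR6 ~20% dearer than GDDR5

gddr6_per_gb = GDDR5_8GB_NOW * GDDR6_PREMIUM / 8

for card, gb in [("RTX 2070/2080 (8GB)", 8), ("RTX 2080 Ti (11GB)", 11)]:
    print(f"{card}: ~${gddr6_per_gb * gb:.0f} of GDDR6")
```

Even under these loose assumptions, the memory alone plausibly costs $40-plus more than the 8GB of GDDR5 on a card built a year ago, and more again for the 2080 Ti's 11GB.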

Another reason could be AMD's lack of competition. The RX Vega series of GPUs was late, consumed a lot of power and ran hot, while offering performance comparable to the GTX 1070 and 1080. For many months the price of Vega was also inflated due to low HBM2 supply. Perhaps NVIDIA saw an opportunity: AMD is likely to be out of the game (at least in the high-end segment) until Navi launches sometime in 2019.

Another possibility is Pascal (10 series) overstock. If rumours are to be believed, during the cryptocurrency mining boom between Q3 2017 and Q2 2018, NVIDIA decided to increase Pascal production, hoping to ease the shortage of graphics cards and allow prices to return to normality. During this period, midrange cards like the RX 580 and GTX 1060 were regularly hitting $500-$700 from official outlets, and third-party sellers often had them listed between $1,000 and $1,500.

This made building an affordable mid-range PC basically impossible. On subreddits such as /r/buildapc and /r/pcmasterrace, users were telling anyone looking to get into PC gaming to just forget about it, buy an Xbox One or PS4, and come back once prices returned to normal.

I think NVIDIA ordered more Pascal GPUs to be produced during the mining craze, expecting it to continue. In Q1 of 2018 NVIDIA made $289M from cryptocurrency miners, a figure that has since dropped to $18M.

NVIDIA's CEO Jensen Huang did say "Cryptocurrency Is Here to Stay", and while that may be technically true, GPU mining has absolutely crashed, and in recent months graphics card pricing has started returning to normality. But what is NVIDIA to do with all this leftover stock?

- They could keep the next architecture (now known as Turing) under wraps until stock depletes, but this could take months, and Pascal has already had an abnormally long life-cycle. Waiting for Pascal stock to deplete could also give AMD a chance to make a comeback with Navi, which is built on TSMC's 7nm and expected sometime in 2019.

- They could discount them, but that would mean lower margins; and if consumers suddenly saw GTX 1070, GTX 1080 and GTX 1080 Ti cards being discounted, many would assume a new lineup was imminent and hold off buying, negating the discount as a means to drive sales.

So what is NVIDIA to do...

Well, they announce the new RTX Turing/20 Series cards and price them ridiculously high, which suddenly makes the Pascal cards look like a good deal. It's quite genius really, because NVIDIA can't lose. Sure, they take some bad publicity, but anyone looking for a high-end GPU either has to pony up for an RTX card or buy a Pascal card, many of which are still only just returning to MSRP. Turing has made cards based on a two-plus-year-old architecture, selling at MSRP, look like a good deal.

Either NVIDIA sells RTX cards for an exorbitant amount, or it sells older cards at (or near) MSRP, shifting old stock while maintaining high profit margins. And it seems to be working: many on Reddit, Twitter and Facebook appear to be buying up Pascal/10 Series cards after finding out how much the RTX cards will cost. When the inevitable price cut comes (once Pascal stock has depleted), NVIDIA will recoup any lost goodwill.

I also expect that sometime in 2H 2019 or 1H 2020 we'll see a 7nm refresh of Turing, or a brand-new architecture altogether, but that's a story for a different article.

So, if you want a cheaper alternative to the Turing (20 series) cards, just buy a GTX 10 series card; at or near MSRP, they're worth the money.

Thanks for reading.

My Twitter
