
Nvidia: Sales of high-end GeForce GTX GPUs are increasing

It is no secret that sales of discrete graphics processing units for personal computers are declining. In the second quarter of the year, shipments of graphics cards for desktop computers dropped to their lowest level in years. Despite this, shipments of graphics cards aimed at gamers are increasing, according to Jeff Fisher, senior vice president of the GeForce business unit at Nvidia Corp.

One of the key reasons why sales of low-end graphics processing units are falling is the increasing performance of graphics processors integrated into modern central processing units. Ten years ago, integrated graphics processors could not deliver decent performance even in casual games, but today they offer respectable frame-rates in a lot of titles. As a result, people no longer need low-end graphics cards for their desktop computers.

However, more and more people are buying GeForce GTX GPUs to play video games, according to Jeff Fisher, who spoke with the Japanese website 4Gamer.net. Mr. Fisher claims that new-generation game consoles do not threaten sales of discrete graphics adapters, but actually help to popularize games and GeForce GPUs.

[Image: Nvidia GeForce GTX Titan X]

Several years ago, Nvidia’s top-of-the-range gaming graphics adapters cost $499. The GeForce GTX 980 launched at a recommended price of $549, the GeForce GTX 980 Ti costs $649 and the GeForce GTX Titan X is priced at $999. Mr. Fisher denies that Nvidia’s graphics processors are getting more expensive with each generation and claims that the Titan series is aimed not at gamers but at high-performance computing enthusiasts.


KitGuru Says: Nvidia has managed to capture over 80 per cent of the world’s GPU market, which is why the company can enjoy increasing sales of its products. By contrast, its arch-rival AMD is struggling to increase sales of its Radeon R9 graphics cards for gamers.



24 comments

  1. Their main consumers (gamers) only get so much Christmas money for the most part.

    Can’t get blood from a stone.

  2. “arch-rival AMD is struggling to increase sales of its Radeon R9 graphics cards for gamers.” – They have an image problem, and that comes down to not beating the competition. Whenever Radeon has sold well, it’s because it was simply better. Nvidia’s high end in 2015 is simply better. Only a small percentage of people buy into the argument that AMD’s GPU offerings have better longevity.

    Everyone upgrades within a year or two, and if AMD aren’t coming to the table with something that beats Nvidia... why should we buy?

  3. I’m a gamer and have 2x Titan X in SLI. I know plenty more people in the same boat as me, who are gamers as well. Just because we are gamers doesn’t mean we are living with Mommy and Daddy, depending on someone else for cash. On the contrary, most of us who build enthusiast-type PCs are gamers who work enough to afford these things ourselves. I think people tend to forget or deny the fact that a lot of gamers today were also gamers growing up in the ’80s and ’90s. It’s just that now a lot of us have decent jobs, have made a good living and can afford these things. It’s no different from fixing up a car or a motorcycle, or dumping cash into any other hobby for that matter. But stereotyping lives on, I see.

  4. Here’s the problem: the GTX950 was just released; granted, it is four times slower than the Titan X, but it has the same GTX prefix, so we can’t tell them all apart. Too many naive enthusiasts will wrongly attribute increased GTX sales to their favourites; they never consider the historical pattern that the low end always sells more units than the high end.

    We need to see the sales breakdown (how many of each card sold and their selling rates).

  5. So agree with this as I am in the same boat xD

  6. When I bought my GTX 970 last summer, I had no choice but to get a 970 rather than a 980, because top-end cards cost so much these days. I can’t justify £500-£600 for a GPU that will give me just 10-15% more fps than the next model down. For some reason, during this long drawn-out recession, GPU prices only seem to have gone up.

  7. Same boat, I’m approaching 40 and even as a child never got anything gaming-related for Christmas or birthdays. It’s always been about how much I can shave off my pay check at the end of the month.

  8. HBM in 2015, what does it get AMD:

    -A competitive graphics card
    -Publicity, they were first to market, HBM will be associated with AMD from now on
    -Expertise, if AMD have any engineers worth their salt they will have learnt a great deal from the design and production of the Fury series and integrating HBM into the GCN architecture. This should give them a leg up in producing their HBM2 product. Handy when you’re David facing down nVidia’s Goliath.

  9. this is why we can’t have nice things..

  10. I have a single Titan X and I’m in my late 20s. I work full time to pay for my performance-PC hobby. My custom water loop actually ended up costing more than the Titan X. I was thinking about adding another Titan, but since I’m playing at 1440p it just doesn’t make a lot of sense to drop another grand for SLI. This is especially true because Pascal and Arctic Islands are coming out next year, which I plan on running in SLI at 4K 120Hz (DP 1.3 willing).

  11. We needed 2x GTX Titan X to run the Luminous Engine Witch Chapter 0 demo (I think they used 4 just for fun).
    But with a GTX Titan P (Pascal, anyone?) and HBM2 there might be enough power to run it.
    And here comes the challenge: the next gen of consoles should be at least as fast as this GPU.

  12. This is the most disappointing news I have read as a PC gamer in months.
    Now mid-range graphics cards will cost players $500.
    Why do players encourage this??

  13. The GTX950 costs $240 here in Canada while the GTX960 costs $230... makes no sense. The GTX950 should be at least $40 cheaper, so I expected $180-$190. I don’t know why it’s more expensive than their better card.

  14. Mid-range is the GTX970 at $330. What do you mean $500? The GTX980 is a top-of-the-line card for $499 and their entry level is the GTX960 for $199.

  15. The 980 isn’t top of the line... that’s the Titan X / GTX 980 Ti. The GTX 980 is a mid-range card, which is why it’s barely better than a last-gen top-of-the-line card such as the 780 Ti or R9 290X.

  16. The price of the GTX950 is exactly why I went with GTX960. 950 cost like $160 at launch, I did some shopping around and got an out-of-the-box overclocked 960 with aftermarket cooler for the exact same money (after rebate). Based on the gaming so far I think what I paid for the 960 was just about right, anything much higher and I think I would’ve felt it was overpriced.

  17. Pascal is one reason that I do not see the need to buy a Titan or 980 or even 970. The next gen GPUs are going to have HBM etc so that is going to be quite a leap in performance. I would hate to throw down $500-1000 for GPUs today when something noticeably better is right around the corner.

  18. Yeah, HBM AND 16nm FinFET Plus. Double the transistors.
    I’m also waiting for both AMD and Nvidia to show what they have.

  19. nVidia is not a Goliath. AMD’s Joe Macri and his team developed GDDR5 and HBM memory:

    http://www.pcworld.com/article/2922599/amd-talks-up-high-bandwidth-memory-that-will-power-its-next-gpus-pokes-nvidia-too.html

    AMD is one of the legendary ancients of Silicon Valley, as old as Intel. nVidia is just an also-ran company.

  20. What... no.

    Mid-range is the GTX x50 Ti with boost and the GTX x60; high end is the x70 and x80,

    and over-the-top bragging rights are the x80 Ti and Titans.

    The normal x70 and x80 have always been considered the high end, and anything above them, like the x85, x90 and Titans (dual GPUs), has been there for bragging rights.

    This dates as far back as the GTX 260, 280 and 285... mid, high, extreme.

  21. No,

    that’s just it. It’s not a competitive card. By the time anyone sees its value, coincidentally around about the time that 4K really takes off, HBM2 will be out!

    I take issue with the premise that HBM will be associated with AMD forever more, because in 2016 it won’t be. So AMD has had this for something like a year and it doesn’t even manage to lead where it counts, even though it has it. The only success here is in the second-hand market: next year, anyone looking at HBM for the benefit at higher resolutions won’t have to buy Greenland... which is just AMD shooting itself in the foot all over again. Good for the consumer, bad for AMD and its bottom line.

    Nvidia have competent people, and more of them. They’ll have HBM2 and it’ll be their first foray with the new standard. People will contrast how lacklustre AMD’s first attempt was with how glorious Nvidia’s will be, and they’ll see the premature release of HBM on AMD’s part for exactly what it was: desperate.

    The incontestable fact? Even with HBM tacked on with masking tape, the architecture wasn’t the winner at the end of 28nm. We can close the book on this one and say that AMD either had no personnel/resources left to explore efficiencies or that there were none left to find.

    Either way, they took an expensive gamble that didn’t pay off.
    One more bloody gasp.

  22. The real problem is the prices in Canada...

    I can easily find the GTX970 at $289.99 in the US, and the same f**king card is like $449.99 in Canada...

    They are taking Canadians for dumbasses for sure... BTW, the exchange rate is NOT 55%...

  23. The fact that the Fury X is sold out everywhere should tell you that plenty of people saw its value.

    (Mostly miners, though.)

  24. Well, derp, when you only have the 960 and 970 and nothing in between... derp. The 960 is a horrible card for the price and a 970 is just too expensive. Where does one go from there? AMD, if power consumption is something you don’t worry about.