
New leak points to three RX Vega cards: Vega XTX, XT and XL

AMD is just a few weeks away from finally launching RX Vega, but it looks like there will be more than one card on the way, with reports this week indicating three different SKUs in the works, codenamed Vega XT, Vega XTX and Vega XL.

Apparently we can expect the RX Vega XTX to be the top of the pile, featuring 4096 shader units, 8GB of HBM2 and 375W board power. This particular model uses an all-in-one liquid cooler, similar to the Fury X. The RX Vega XT shares the same core specs but swaps in an air cooler and draws 285W of board power. Finally, the RX Vega XL is said to drop down to 3584 shader units, with an unconfirmed amount of HBM2; it will also use an air cooler and 285W of board power.

The initial report came from a source speaking with the German site 3DCenter, which published the three Vega codenames alongside shader counts and TDPs. Videocardz then confirmed the information with its own sources, adding that a version of RX Vega with 16GB of HBM2 hasn't been ruled out just yet, though in my opinion that seems unlikely based on the information we currently have.

AMD's official RX Vega unveiling is still set to take place at SIGGRAPH on the 30th of July. The latest performance leaks indicate that at least one RX Vega GPU will pull ahead of the GTX 1080 but won't match the GTX 1080 Ti.

KitGuru Says: With the RX Vega launch so close, we are bound to see a few leaks during the build-up. Hopefully, once all is said and done, Vega will meet expectations.



7 comments

  1. 375W?
    That doesn't sound good. It would have to be a fair bit better than a 1080 Ti for that power usage, and I don't see that happening.

  2. Completely irrelevant comparison. Different chips from different companies.

  3. GrimmReaper WithaSpoon

    My guess is that, with rasterization enabled and release drivers, the water-cooled GPU will be 10-15% worse than the 1080 Ti, as it will be an overclocked version of the Vega we've seen so far, so about 1650-1700MHz?

  4. The issue with AMD's rasterisation tech, which is very good, don't get me wrong, is that it only reaches its full potential in well-made DX12 titles. And there is no game on the market that MS considers 'good' for DX12. They said TimeSpy was good, but that's it. They used Deus Ex: Mankind Divided's DX12 mode for the demo, but that was a white lie. The (awful) DX12 mode in MD right now wouldn't actually be able to use it, and the demo version was thrown together as a proof of concept. I would like to clarify that I have no issue with AMD using this white lie this time, as it wasn't really for marketing. Now consider that NVidia is rapidly making progress on the same tech, as evidenced by their recent WHQL driver release boosting many DX12 features, including rasterisation mapping to tier 3 for Pascal GPUs, and the advantage isn't that great, limited as it is by DX12, which, like all new APIs, is taking a very long time to mature.

    And drivers? Don't believe the "release driver" BS. I've been around the GPU release block many a time. I work in this stuff, and every time I see it: "Just wait! The GTX 700 series will be WAY better with optimised drivers!", "Just you wait and see, RX 400 will be WAY better with optimised drivers!". Were they? No. Why? Because NVidia and AMD do not engineer new GPUs on a different GPU's drivers, for many reasons that should be obvious. But the one thing no one, and I mean no one, ever remembers…is that GPUs are standardised. Yes, okay, we need APIs. But they all use the same shader core layout (NVidia got all pretentious and called them "CUDA" cores because marketing), and they all use nearly the same instructions, though AMD has intrinsic functions, but those are only for console devs, and games can only use them if they have a detection trigger and a separate Radeon RX render pipeline. This is why the magic release drivers never actually do a damn thing.

  5. GrimmReaper WithaSpoon

    I have no idea about the rasterizer; we'll see when it gets released.
    Also, I don't know where you've been living, but massive driver performance increases are real. I think it was when the R9 300 series was released or something: it was seeing terrible performance in some games and good performance in others. After some time, a driver update was released that increased performance by 30-35%, AND it worked on previous-gen GPUs. So let's just wait and see. Hell, it's two weeks away.

  6. I've tried to go over this in my head to see what it could indicate, but there are too many variables. However, taking the obvious mining and Brexit bullshit variables out, the stores in the UK definitely have wind of something. Five weeks ago I picked up an MSI GTX 1080 Gaming X Plus for £478 after a £43 cashback offer, and there were similar offers from Asus and EVGA; tonight my exact card is £594.98. The GTX 1080 cards I looked at, including the EVGA hybrid, all seem to have risen in the region of £40-80. I hope this isn't an indication of a similarly performing AND similarly priced Vega card. That would be a disaster.

  7. Has nothing to do with performance and everything to do with the fact that the newest Nvidia drivers have greatly increased their mining capability. AMD cards have been selling at 50% over retail for two months or so now, and it looks like Nvidia cards are now being utilised by the miners as well.