SLi | KitGuru

Nvidia releases new driver for Ampere graphics cards, SLI support moving to graphics APIs (18 September 2020)

Nvidia's latest graphics driver introduces support for the recently launched RTX 3080 and the upcoming RTX 3090 cards. The 456.38 WHQL graphics driver also introduces Nvidia Reflex, Nvidia Broadcast, and the typical set of optimisations for some games. Additionally, it disables implicit SLI on Ampere graphics cards, but explicit SLI will still work through graphics APIs such as DirectX 12, Vulkan, and OpenGL.

The latest Nvidia graphics driver brings DLSS and RTX to Fortnite on all RTX graphics cards. Halo 3: ODST and Mafia: Definitive Edition received in-game optimisations for improved performance and stability, and a set of new features was also added, including an in-game performance overlay, one-click performance tuning, HDR ShadowPlay Capture, and AV1 decoding. GeForce Experience’s one-click optimal settings now support 13 new games, and the list of G-Sync compatible displays gained five new monitors: the Acer XV253Q GW, Asus VG27AQ1A, LG 27GN600 and 27GN800, and ViewSonic XG270Q.

Nvidia Reflex, an in-game latency reduction feature, was also added, but for now it is only compatible with Fortnite and Valorant. All GPUs from the GTX 900 series onwards support Nvidia Reflex. On mid-range cards such as the GTX 1660, users can expect a latency improvement of up to 33%. On the new RTX 3080, in-game latency is already lower than on mid-range cards, but it can be further reduced by up to 40%.

Another interesting feature that comes with this driver is Nvidia Broadcast, an application that “upgrades any room into a home broadcast studio” through the use of AI. Nvidia Broadcast brings three AI-powered features: Noise Removal, which strips out background noise; Virtual Background, which lets you change your background without the need for a green screen; and Auto Frame, which tracks your head movements and keeps you centred in the frame.

The changelog of the new driver also states that the new Ampere graphics cards won't support implicit SLI. Only explicit SLI will be supported, and it will be handled by APIs like DirectX 12, Vulkan, and OpenGL. This was expected given that the only RTX 30 series card featuring the NVLink interface is the RTX 3090.
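For a sense of what "explicit" means in practice, below is a hypothetical DirectX 12 sketch (our own illustration, not Nvidia's or the driver's code; the helper name CreatePerGpuQueues is made up for the example). On a linked adapter the GPUs appear as nodes of a single device, and the application routes work to a specific GPU through a node mask.

```cpp
// Hypothetical sketch only: explicit multi-GPU under DirectX 12 on a linked
// adapter (the SLI/NVLink case). Each physical GPU in the link is a "node"
// of one ID3D12Device, and each command queue is aimed at a node via NodeMask.
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12CommandQueue>> CreatePerGpuQueues(ID3D12Device* device)
{
    std::vector<ComPtr<ID3D12CommandQueue>> queues;
    const UINT nodeCount = device->GetNodeCount(); // physical GPUs in the link

    for (UINT node = 0; node < nodeCount; ++node)
    {
        D3D12_COMMAND_QUEUE_DESC desc = {};
        desc.Type     = D3D12_COMMAND_LIST_TYPE_DIRECT;
        desc.NodeMask = 1u << node; // route this queue to one specific GPU

        ComPtr<ID3D12CommandQueue> queue;
        if (SUCCEEDED(device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue))))
            queues.push_back(queue);
        // The application then decides how frames or passes are split across
        // these queues; that division of labour is what "explicit" refers to.
    }
    return queues;
}
```

The key difference from implicit SLI is that the driver no longer splits the work behind the game's back: the engine has to distribute rendering across those queues itself, which is why multi-GPU support now rests with individual developers.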

Anyone who buys selected GeForce RTX 30 Series GPUs or selected pre-built PCs with RTX 30 series GPUs will receive a bundle including a digital copy of Watch Dogs: Legion and a one-year GeForce NOW membership.

Discuss on our Facebook page, HERE.

KitGuru says: Have you already downloaded the new Nvidia graphics driver? What do you think about Nvidia Reflex and Broadcast? How do you feel about the changes to SLI?

Nvidia is developing a new multi-GPU tiled rendering technique for Turing cards (22 November 2019)

Tile-based rendering isn't new to NVIDIA GPUs, but tile-based rendering in multi-GPU systems is another thing. It seems this rendering technique is already in place in Nvidia's driver, but it doesn't look like there's any way for developers to implement it in their games just yet.

A user on the 3DCenter.org forums, named Blaire, has found evidence in Nvidia's GPU drivers that a multi-GPU rendering mode has been added. This rendering technique is named CFR, which could be short for “checkered frame rendering” or “checkerboard frame rendering”, according to TechPowerUp.

As with any tiled rendering method, a frame is divided into smaller squares called tiles, similar to a checkerboard. In CFR's case, the tiles within a frame are shared between multiple GPUs. This differs from SFR (split frame rendering), which divides a frame into two equal parts, and AFR (alternate frame rendering), in which successive frames are rendered alternately by each GPU.

CFR will, supposedly, offer less micro-stutter and better resource allocation than AFR.
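To make the split concrete, here is a small illustrative sketch (our own guess at the scheme, not Nvidia's implementation) of how the tiles of a single frame could be dealt out to two GPUs in a checkerboard pattern, whereas AFR would hand each GPU entire alternating frames. The 8×4 tile grid is an arbitrary assumption.

```cpp
// Illustrative only: checkerboard-style assignment of one frame's tiles to
// two GPUs. Prints which GPU (0 or 1) would render each tile.
#include <cstdio>

int main()
{
    const int tilesX = 8;   // assumed tile grid width for the example
    const int tilesY = 4;   // assumed tile grid height for the example

    for (int y = 0; y < tilesY; ++y)
    {
        for (int x = 0; x < tilesX; ++x)
        {
            const int gpu = (x + y) % 2;   // adjacent tiles go to different GPUs
            std::printf("%d ", gpu);
        }
        std::printf("\n");
    }
    return 0;
}
```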

The requirements for NVIDIA CFR, according to 3DCenter, are:

  • DirectX 10/11/12
  • NVLink
  • Turing-based NVIDIA graphics cards (with NVLink)

You can force CFR through nvidiaProfileInspector, but it still has a lot of compatibility issues, which is to be expected from a still unofficial feature.

KitGuru says: Checkerboard rendering techniques have been used a lot lately; consoles use them to produce upscaled output at resolutions up to 4K. What do you think of this technique?

Nvidia won’t be supporting NVLink SLI on the GeForce RTX 2070 (22 August 2018)

Nvidia announced its brand new RTX 20 series of GPUs earlier this week, with many eyes drawn to the ‘cheaper' GeForce RTX 2070 launching in October. Sadly, it looks as though the RTX 2070 will be following in the footsteps of the GTX 1060, with Nvidia seemingly removing the option to link two cards in SLI via its NVLink bridge.

NVLink compatibility wasn’t a point of discussion at Nvidia’s GeForce celebration, prompting sleuths across the internet to dig a little deeper into what we can expect from the new real-time ray tracing capable graphics cards. Out of the three GPUs announced at the event, only the GeForce RTX 2080 and the flagship RTX 2080 Ti product listings house a dedicated NVLink tab on Nvidia’s website, while the RTX 2070 does not.

 

Image: GeForce RTX 2080 Ti (right) and GeForce RTX 2070 (left)

“The GeForce RTX NVLink bridge connects two NVLink SLI-ready graphics cards with 50X the transfer bandwidth of previous technologies,” reads the site’s description. “This means you can count on super-smooth gameplay at maximum resolutions with ultimate visual fidelity in GeForce RTX 2080 Ti and 2080 graphics cards.”

The nail in the coffin is seen in the images of Nvidia’s reference cards: the GeForce RTX 2080 and 2080 Ti house a bump in the upper left corner, while the RTX 2070 remains smooth all the way around. Although Gigabyte has a single infographic on its website hinting that the NVLink connector could come to board-partner RTX 2070 cards, it seems unlikely, given that no vendor has discussed the technology and all have been careful about the angles of their product photos. Some, such as MSI and ASUS, have not even released product listings for the mid-range card.

Image: Gigabyte

Although Nvidia hasn’t commented on the matter and is unlikely to in the future, it’s speculated that the feature was removed because two RTX 2070s in SLI would be a more cost-effective route to similar or even higher performance than a single, more expensive RTX 2080 Ti. Many believe the same reasoning lay behind Nvidia’s culling of SLI support on the GTX 1060, which makes this move less of a surprise and more of a frustration.

Nvidia is continuing to offer 2-way NVLink bridges after moving away from native 3-way and 4-way SLI support with Pascal. These start at $79.99, with a 60.96mm 3-slot option and an 81.26mm 4-slot version, limited to one per customer. It’s possible that the 2-way limit is just advisory, as it was with the GTX 10 Series, but that’s difficult to tell without hands-on time with the new graphics cards.

KitGuru Says: Given that Nvidia has effectively removed the option to add another GPU at a later date, I can’t say I’m the biggest fan. It forces people to spend a great deal more right off the bat or wait a considerable amount of time until the GPUs become more affordable. How do you feel about Nvidia’s treatment of SLI with the RTX 20 Series?

Intel Core i9-8950HK spotted again, this time inside of an MSI laptop with dual GPUs (13 February 2018)

Back in November, an AIDA64 update unveiled some new upcoming Intel processors, the most interesting of which was the Core i9-8950HK. This is set to be the first Core i9 processor for laptops and will apparently be based on the Coffee Lake architecture. This week, this particular CPU broke cover again, indicating that Core i9 laptops will be on the way soon.

The latest leak comes from 3DMark, where a new MSI laptop was tested featuring two GTX 1080 GPUs and the Core i9-8950HK processor. As the AIDA64 leak suggested, the CPU contains six cores and twelve threads, and it will also supposedly be unlocked, opening it up for user overclocking.

According to the 3DMark leak, which was screenshotted by Videocardz, the Core i9-8950HK has a 2.9GHz base clock and a 3.9GHz boost. That is unfortunately all the information we have so far, but it seems laptop makers do in fact have the chip in their hands and are currently developing new models equipped with it.

As for MSI's mystery laptop, it is set to be a powerhouse spec-wise, with dual GPUs, 32GB of RAM and dual SSDs. Hopefully we'll hear more about this soon.

KitGuru Says: With this being a Core i9 CPU, I wouldn't expect to see it in a laptop under the £2000 to £2500 mark. It is almost certainly going to be reserved for the beefiest of them. Are any of you thinking about going for a gaming laptop? How much would you be willing to spend for true desktop replacement performance?

Acer Predator 21 X Laptop (21-inch curved IPS screen, 120Hz, GTX 1080 SLI) (11 July 2017)

The Acer Predator 21 X is a huge laptop that is packed with top notch gaming hardware. The headline feature is the curved IPS panel with a refresh rate of 120Hz that sports G-Sync technology. There is minor controversy about the size of the screen as the specification says 21-inches and our tape measure suggests 21.5-inches but what the heck, it’s a mere detail (and measuring a curved screen is an inexact science anyway.) The rest of the specification includes a mobile Core i7, dual GTX 1080 GPUs in SLI and a bunch of DDR4 memory along with a pair of SSDs in RAID 0. It’s a gaming beast of a laptop but that knowledge does not prepare you for the size and weight of the Predator 21 X.

Put it this way, catching a falling Acer Predator 21 X is a nasty shock to the system.

Acer Predator 21 X Specification:
CPU: Intel Core i7-7820HK (2.9GHz-3.9GHz)
Display: Curved 21-inch 2,560×1,080 IPS, 120Hz, G-Sync
System Memory: 64GB DDR4-2400MHz
Graphics: Dual Nvidia GeForce GTX 1080 GDDR5X 8GB in SLI
Storage: 2x 512GB NVMe PCIe Toshiba XG3 SSDs in RAID 0 gives a 1TB array plus 1TB 7,200rpm HDD

I/O Ports:
1x HDMI
2x DisplayPort
4x USB 3.0 (Type A)
1x Thunderbolt 3 (Type C)
1x Gigabit Ethernet
Headset Jacks
DC Jack
Audio: Four speakers
Wireless LAN: Killer 1535 (802.11ac, a/b/g/n compatible)
Battery: 8-cell Li-Ion 88Wh/6,000mAh
Power adapters: Dual 330W
Security: Kensington Lock
Dimensions: 568mm (W) x 315mm (D) x 69mm-83mm (H)
Weight: 8.5kg
OS: Windows 10 Home

Acer tells us they will only make 300 units of the Predator 21 X worldwide, however we got the distinct impression our review sample was something of a ‘reviewer’s special' that fell outside of the norm, and not in a particularly good way. The memory had been cut from 64GB to 32GB and the SSD RAID array was a mere 512GB instead of the full 1TB.

We are confident these changes didn’t affect performance but still, you know, it would have been good to see the full-blooded Predator 21 X in action. Also, the top panel that covers the right-hand cooling fan was a funky custom part and did not carry an official Acer number.

We mentioned most of the features of the Predator 21 X in our video, but one slipped by: the Tobii eye-tracking software connected to the webcam. This software can be used in a handful of games to control your character and the way they interact with the game, for example by looking at a group of enemies to switch the aiming point of your weapon.

Truth be told your reviewer found it somewhat ominous to see the activity LED responding to movement within the room. This laptop is watching you, in much the same way your phone is listening to you, and that isn’t a great feeling. If you want to see this in action then watch our full review of the Tobii 4C Eye Tracker over HERE.

In our video we ask a question (to which we do not have an answer) as to why Acer has used a mobile Core i7 processor. We have observed in the past in a number of high end laptops that a mobile CPU hurts performance and that you get more performance with a desktop CPU. Clearly a mobile CPU has a part to play in a laptop where there is limited cooling but that does not seem to apply to the Acer Predator 21 X. The chassis is so large and chunky you would think Acer could install any motherboard and CPU combination they desired.

Testing

The short version of events is that the Acer Predator 21 X powers through our tests. The only downside is that we used a screen resolution of 1,920 x 1,080 to obtain benchmark results that could be compared with previous laptops. You will appreciate the native resolution of 2,560×1,080 is completely out of the ordinary.

Cooling Performance

Acer has designed the Predator 21 X chassis to accommodate a massive cooling system with eight heat pipes and five cooling fans. Two of the coolers sit under the keyboard while the three large units are located at the rear of the chassis where they handle the CPU and two GPUs.

In our stress tests the CPU temperature sat at 90 degrees and the GPUs were in the mid-80s. Gaming temperatures were 5-7 degrees cooler than those stress test figures.

Acoustics performance

Those temperatures may sound a bit high but they were achieved with the fans running at very low speeds. The starting point on the fan curve is 500rpm, increasing to 1,000rpm in most circumstances. If you pull the pin and manually switch the fans to full speed they run at 4,200rpm which is exactly as noisy as you would expect.

The point here is that Acer has an enormous amount of headroom with the cooling but has selected fan speeds that control the cooling in this epic gaming laptop with the minimum of noise.

Battery life

We were stunned to see the battery will keep the Acer Predator 21 X running in PCMark 8 for one hour 43 minutes, which you can roughly double to three and a half hours in the real world. The battery is a decent size at 88Wh/6,000mAh but not massively large, so this is testament to the power-saving technology in Intel and Nvidia silicon.

It also highlights the way the GPUs throttle back massively on battery power, rendering game play a non-starter. Quite why Acer has bothered to provide a decent battery for the least portable laptop we have ever seen is a completely different question.

Closing Thoughts

A review is structured to guide the reader/viewer towards a buying decision. What are the strengths and weaknesses of the Acer Predator 21 X, how does it compare with competing products, and does it represent good value for money? The snag here is that the Predator 21 X is a uniquely large laptop with a massive curved screen, and it weighs so much that many people would dispute it deserves to be called a laptop in the first place.

Added to that, the price is so high at £9,000 that it seems ridiculous to discuss value for money. For the same sum of cash you can buy two MSI Titans or two and a half Razer Blade Pro laptops.

If we simply look at the performance of the Acer Predator 21 X we don’t move much closer to a buying decision as the combination of Core i7 and dual GTX 1080s is well known. There are considerably cheaper ways to achieve these frame rates, but again that is hardly the point.

So really it boils down to whether you want a laptop with a curved 21-inch screen and outrageous 2,560×1,080 resolution with 120Hz refresh rate. Any laptop that can accommodate such a screen would need to be very large and when you add in the grunty graphics and hefty cooling system you end up with a laptop that is both huge and heavy.

Once you work through those questions you may need to check your bank account for the necessary £9,000 and that should pretty much deliver the answer. So yes, you might like the idea of owning an Acer Predator 21 X but no, you’re not very likely to make the leap and actually purchase this mighty laptop. For the 300 people who do – we can only be envious!

Price: £8,999.99 inc VAT, USD $9,000 or €10,000 – on sale in October. More info over HERE.

Buy from the ACER STORE HERE.

Pros:

  • Huge 21-inch screen with ultra wide resolution.
  • Dual GTX 1080s in SLI power your games along.
  • The cooling system is superb.
  • PredatorSense software controls lighting, fans and overclocking.
  • Mechanical keyboard with Cherry Brown switches is excellent.

Cons:

  • Epic cost.
  • Massively heavy.
  • The chassis is huge.
  • By any sensible criteria this isn’t really a laptop.

KitGuru says: The most bonkers laptop in the world. Superb to use but incredibly expensive.

Be sure to check out our sponsor's store, EKWB, here

The Vulkan API does support multi-GPU, even on older versions of Windows (23 March 2017)

Over the last week or so, it has been widely reported that the Vulkan API would not support multi-GPU setups on older versions of Windows. This topic came to a head when the studio behind Star Citizen announced plans to drop DirectX 12 in favour of Vulkan due to its multi-OS support. It turns out those reports were inaccurate, as the Khronos Group has come out this week to explain that the Vulkan API will support SLI/Crossfire on Windows 7/8.1 as well as Windows 10 and Linux.

The question of multi-GPU support in Vulkan first popped up at GDC earlier this month. Since then, it has been reported that SLI/Crossfire functionality would be tied to Windows 10, but that is not the case. In an updated developer blog, Khronos clarified the situation: “The good news is that the Vulkan multi-GPU specification is very definitely NOT tied to Windows 10. It is possible to implement the Vulkan multi-GPU extension on any desktop OS including Windows 7, 8.X and 10 and Linux.”

There was cause for confusion, though. At GDC this year, some Khronos presentations mentioned that Vulkan multi-GPU functionality required the Windows Display Driver Model (WDDM) to be in Linked Display Adapter (LDA) mode. This led to the assumption that multi-GPU setups wouldn't work with Vulkan on Windows 7 or Windows 8.1, but that appears not to be the case.

The use of WDDM is required for Windows operating systems to take advantage of the Vulkan multi-GPU extension. However, LDA mode is not explicitly required; it can just make the implementation easier. If a developer does decide to use LDA mode, it is not specifically tied to Windows 10 and can be used on other versions of the OS, including Windows 7 and 8. According to the Khronos Group, there are already developers planning to ship games with this level of multi-GPU support included.
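As a rough illustration of the mechanism being discussed, the sketch below (ours, not from the Khronos blog) enumerates Vulkan device groups, the construct through which a driver exposes linked GPUs to an application. It assumes a Vulkan 1.1 loader, where this functionality was later promoted to core, and omits error handling.

```cpp
// Rough sketch: listing Vulkan device groups. A "device group" is a set of
// physical GPUs the driver has linked together, e.g. an SLI or CrossFire pair.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main()
{
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo instanceInfo{};
    instanceInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    instanceInfo.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&instanceInfo, nullptr, &instance) != VK_SUCCESS)
        return 1;

    // Ask the driver how many groups of linked physical devices it exposes.
    uint32_t groupCount = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, nullptr);

    VkPhysicalDeviceGroupProperties init{};
    init.sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
    std::vector<VkPhysicalDeviceGroupProperties> groups(groupCount, init);
    vkEnumeratePhysicalDeviceGroups(instance, &groupCount, groups.data());

    for (uint32_t i = 0; i < groupCount; ++i)
        std::printf("Device group %u contains %u physical GPU(s)\n",
                    i, groups[i].physicalDeviceCount);

    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

A logical device created over one of these groups can then address each physical GPU with a device mask, which is how a title would implement SLI or Crossfire-style rendering on any OS whose driver exposes the extension.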

KitGuru Says: After talking about Star Citizen's switch from DX12 to Vulkan, we received a ton of comments complaining about the lack of multi-GPU support on older versions of Windows. Fortunately, it looks like the situation was misconstrued this time around, so those sticking to Windows 7 or Windows 8.1 will still benefit from their SLI or Crossfire setups in Vulkan games. Well, as long as developers choose to use Vulkan's multi-GPU extension.

MSI pushes laptop gaming with high end GTX 10 series notebooks (18 August 2016)

Gaming on the go is set to get a lot more impressive with this generation of mobile chips from Nvidia, and MSI, like many manufacturers, is looking to take advantage of that. It has a new line of gaming notebooks, each sporting new mobile Pascal GPUs, high-end displays and powerful processors, to deliver desktop-level gaming power in a laptop package.

We first heard about these new notebooks last week, when MSI teased the use of 4K displays, high refresh rate panels, innovative cooling solutions and much more, but we didn't know what GPUs they would have at that point. That was because Nvidia had yet to announce its new Pascal mobile line-up. Once it did so this week, it's safe to say many of us were impressed.

So what can you expect in MSI's laptops now that we know these GPUs exist? The MSI GT72VR Titan SLI can come with either a single GTX 1080 or a pair of SLI-linked GTX 1070s under the hood. That's paired up with a 120Hz, 5ms display and an Intel Core i7 6820HK processor.


If that's not enough for you though, there's always the MSI GT83VR Titan SLI which can pair up dual GTX 1080s under the hood, for a ridiculous level of graphical power in a single screen set up, whether desktop or laptop.

Of course that sort of set up is mostly overkill, even if you're playing the most intensive of VR games. The GT72VR and GT62VR Dominator series laptops come with either a GTX 1070 or GTX 1060, both of which should be more than capable of handling all of the latest games.

MSI also expands this Pascal equipped line up with a couple of thinner variants, as well as those designed with streaming in mind, by bundling in a Killer LAN chip.

For more information on the range, check out the product launch page here.

Discuss on our Facebook page, HERE.

KitGuru Says: That is a lot of new laptops to chew over. I can't help but feel that just a couple with configurable hardware would be easier to digest. 

Nvidia released a new ‘Game Ready’ driver for No Man’s Sky (16 August 2016)

Nvidia has released a new “Game Ready” driver covering No Man's Sky, Deus Ex: Mankind Divided, Obduction, F1 2016 and the Paragon Open Beta, giving gamers with green-branded hardware a leg up in those various games. As with every driver release, this one comes with a few bug fixes too.

Of course the official name for this release isn't anything to do with any particular game. Technically it's titled the 372.54 WHQL certified driver, but the above games are what's important, because those are the experiences that are going to benefit the most from its release.


This release adds an SLI profile for No Man's Sky, making it possible to double down on your graphical hardware for the game. Nvidia does warn however, that if your CPU is weak, using SLI in the game could lead to bottlenecking. You have been warned.

Deus Ex: Mankind Divided, Obduction, F1 2016 and Paragon also have new profiles in this release, as well as a number of optimisations to make them perform better on Nvidia hardware.

Video: http://www.youtube.com/watch?v=kd9LP6RGh5Q

Along with improvements in individual games, this release also adds support for Pascal hardware in laptops, with mobile versions of the GTX 1080, 1070 and 1060 all now being slotted into notebooks around the world.

Bug fixes in this release include Witcher 3 SLI shadow bugs, stuttering in the full-screen Netflix Windows Store app, and high GPU clock speeds when two DisplayPort displays are connected.

If you'd like to download the new driver, you can grab it here.

Discuss on our Facebook page, HERE.

KitGuru Says: Keep your drivers updated guys. I'm terrible for it, but there's always extra performance to be found, especially if you're playing the latest and greatest games. 

Nvidia ditches ‘enthusiast key’, limits 3 or 4-way SLI to specific apps (9 June 2016)

Back when Nvidia first announced the GTX 1080 and GTX 1070, we began hearing reports that there would no longer be support for 3-way or 4-way SLI. However, once the review embargo lifted, we learned that Nvidia had plans to introduce an ‘enthusiast key' which would unlock the ability to use more than two Pascal GPUs. Now, things have changed once again: apparently Nvidia has ditched the ‘enthusiast key' idea and will only enable 3-way or 4-way SLI for certain applications, like benchmark tools.

This is according to a report from PCPer, who got in touch with Nvidia to find out what was going on with the enthusiast key solution. It turns out, Nvidia no longer thinks that's a great idea and will instead allow future ‘Game Ready' drivers to take advantage of 3-way and 4-way SLI in specific applications, like overclocking benchmarks.


In a statement, Nvidia said: “With the GeForce 10-series we’re investing heavily in 2-way SLI with our new High Bandwidth bridge (which doubles the SLI bandwidth for faster, smoother gaming at ultra-high resolutions and refresh rates) and NVIDIA Game Ready Driver SLI profiles. To ensure the best possible gaming experience on our GeForce 10-series GPUs, we’re focusing our efforts on 2-way SLI only and will continue to include 2-way SLI profiles in our Game Ready Drivers.”

However, if a game developer specifically wants to support more than two GPUs then Nvidia is open to working with them to do that: “Some developers may also decide to support more than 2 GPUs in their games. We continue to work with all developers creating games and VR applications that take advantage of 2 or more GPUs”.

For the most part though, 3-way and 4-way SLI support will be limited to benchmarks for the overclocking crowd: “For our overclocking community, our Game Ready Drivers will also include SLI profiles for 3- and 4-way configurations for specific OC applications only, including Fire Strike, Unigine and Catzilla.”

KitGuru Says: 3-way and 4-way SLI support has never been great in games so perhaps simply focusing on 2-way SLI support will be a better use of resources. However, it is a little odd to see Nvidia changing its plans just a few weeks after initially announcing its plans for an ‘enthusiast key'. 

DOOM will be getting SLI support soon (19 May 2016)

Just as the Vulkan API isn't currently implemented into DOOM (2016) in the public build, SLI support is also missing at the moment. This is unfortunate for those with high-end multi-GPU PCs built to power the latest titles at 4K/UHD resolutions but support is on the way, according to one of id Software's developers.

Writing over on Twitter, Tiago Sousa, id Software's lead programmer on the id Tech 6 engine and DOOM, addressed calls for SLI support and explained why it isn't in the game right now. It turns out that there is an issue with shadows limiting the scaling of multiple graphics cards in SLI.


There has been talk of the id Tech 6 engine just not supporting SLI at all but that doesn't appear to be the case here. We don't know exactly when the SLI update is coming though, nor do we know if this also applies to AMD's CrossFire.

In the meantime, if you're stuck with using one card, you can check out our DOOM analysis and benchmarks, which went up earlier this week. However, do note that while AMD's R9 390 and R9 290 performance was poor at launch, a driver came out yesterday aiming to fix performance for these cards specifically.

KitGuru Says: DOOM is an excellent game but it does seem that quite a few are waiting for SLI support to enjoy it at higher resolutions. We have reached out to Nvidia to see if they can give us any idea on when we may see SLI support added to the game but we haven't heard back yet so for now, just keep an eye out for any patches as support is definitely on the way. 

Forget three, four card SLI with Nvidia GTX 10 series GPUs (12 May 2016)

Although the market for those running more than two graphics cards in SLI was always rather small, there are still likely to be a few people disappointed that three and four card set-ups will not be natively supported by Nvidia's 10 series GPUs.

Of course when performance is as high as Nvidia claims with the new-generation of cards, it shouldn't matter too much. Purportedly a single GTX 1080 will be able to beat out two 980s in SLI and even the monstrous Titan X. It might prove problematic for those looking to break 3Dmark records and big system sellers though.


The new SLI bridges are certainly optimised for two cards. Source: VideoCardz

This news actually came out of Nvidia's conference last week, but most people seemed to miss it. Indeed, it was TechofTomorrow on YouTube who broke the news, highlighting how in his experience this shouldn't make much of a difference, as three and four card set-ups were not good value for money anyway.

Video: http://www.youtube.com/watch?v=uWvmt9wk0n4&feature=youtu.be&t=7m7s

Jump to seven minutes in to hear the particular announcement.

The fact that “native” support isn't planned for the 10 series drivers doesn't mean it won't be possible with a few tweaks, or even straight out of the box. It just means that Nvidia won't be going out of its way to fix any problems it discovers with three and four card set ups. It seems likely that 3+ card scaling won't be great either without optimisations.

Although this might seem like a shame, if Nvidia puts all its eggs in the two-card basket, perhaps it can optimise performance in those scenarios. That may be extra important too, considering VR has the potential to see much greater benefits from two card set ups, since each one can render for a single eye.

Discuss on our Facebook page, HERE.

KitGuru Says: How many of you have ever run more than two cards in SLI? Personally I've never even run two. It's always just made more sense to buy a card from a newer generation.

MSI launches active cooling, three-way and four-way SLI bridge kits (24 March 2016)

Although multi-GPU set-ups are still the preserve of the rich and the obsessive, that doesn't mean there isn't a decent-sized audience paying big bucks for major gaming systems. With that in mind, MSI has launched new SLI bridge kits that make it possible to link up two, three or four graphics cards, all with additional active cooling to help keep a monstrous set-up at a respectable temperature.

MSI previously introduced its two-way SLI bridge kit, known as the 2WAY L, but now it is expanding that range to include 3WAY and 4WAY solutions too. They combine a solid bridge section housing the respective connectors with an extended arm that adds an extra 120mm fan to improve cooling across the multiple GPUs.


There is also some additional styling in the form of LED lighting, which can be controlled through MSI's gaming application, plus a metallic shroud around the SLI connectors to prevent direct damage and aid further heat dissipation.

Technically this is part of MSI's Gaming G Series, so it features that logo amongst all of the red and black styling, as well as various Nvidia and SLI logos.


KitGuru Says: Despite having had several high-end PCs over the years, I've never run a multi-GPU set up. It always just made more sense to buy a more powerful card from the new generation. What about you guys though? Are you experienced SLI'ers?

 

Aorus launches X7 and X5S gaming laptops (22 March 2016)

Aorus has been making quite the name for itself in the gaming notebook space and today, it is expanding its range with the X7 and X5S, bringing raw power in an extremely slim design with Intel Skylake processors, Nvidia graphics and the option for a 4K UHD display.

The X5S is first on the list, with a thin and light design, packing an Intel Core i7 6700HQ, 16GB of DDR4 RAM and an 8GB GTX 980m powering the 15.6″ 4K IPS display. Storage needs are also covered with a 256GB mSSD and a 1TB HDD, coming in at £2099.99. 


Next up is the Aorus X7 Pro, which focusses on packing in a ton of power with an Intel Core i7 6820HK, 16GB of DDR4 RAM, a 512GB NVMe SSD and dual GTX 970m GPUs running in SLI. There is also a 17.3-inch 1080p display with Nvidia G-Sync technology, allowing the GPUs to control the screen's refresh rate while gaming. This one hits retail at £2399.99 so it is definitely a pricey investment.

KitGuru Says: While both laptops are expensive, they certainly pack in some high-performance components. Are any of you currently looking into grabbing a gaming laptop? Do you have any experience with Aorus?

Windows Store games come with some pretty big limitations (26 February 2016)

Microsoft is attempting to crack into PC gaming once again this year with a big push for new games coming to the Windows 10 app store. However, there are some problems with that: it turns out Windows Store games come with some pretty big limitations, with games like Rise of the Tomb Raider lacking even the basic option to switch off V-Sync.

When you download a game through the Windows 10 app store, it does not give you a traditional .exe file; instead, the game will look something like this in your file directory: C:\Windows\explorer.exe shell:AppsFolder\39C668CD.RiseoftheTombRaider_r7bfsmp40f67j!App. According to user reports, Microsoft's Windows 10 app format lacks support for SLI and CrossFire as well as full-screen/windowed display modes, and apparently switching off V-Sync just can't be done.


One of the developers for Nixxes Software, the team behind Rise of the Tomb Raider on PC, confirmed that disabling VSync is not supported in the current UWP framework, which means those buying the game through the Windows Store are stuck with it forced on.

You could try and force multi-GPU support but since there is no traditional .exe file for your drivers to point to, this may not be possible either. This is worrying news as Microsoft is pushing gaming on its Windows 10 Store quite heavily, with games like Quantum Break and Gears of War set to be exclusive. If Microsoft is going to continue doing this, then it will need to update its framework to support PC properly.

KitGuru Says: Obviously we only have Rise of the Tomb Raider as a real example of this at the moment so the situation may have improved a bit by the time Gears of War or Quantum Break hit. Either way, there seems to be some issues with Microsoft's Universal App format that need to be fixed if it is to be used for gaming. 

Warner Bros has given up on multi-GPU support for Arkham Knight (19 November 2015)

It looks like the Arkham Knight PC saga is finally over, though it seems the game was too broken to get all of the features it promised working properly. While Arkham Knight has been available to buy on Steam once again for a few weeks now, we were still expecting one last patch for multi-GPU support. Unfortunately, it looks like it won't ever be coming.

The game runs okay at this point on single cards but Warner has also been promising an SLI/CrossFire fix since it began outlining its patches over the summer. However, according to one of the developers over at WB, the team just didn't want to risk it anymore.


Answering a question on the Steam Forum, a WB developer account said: “We’ve been working with our development and graphics driver partners over the last few months to investigate utilizing multi-GPU support within Batman: Arkham Knight. The result was that even the best case estimates for performance improvements turned out to be relatively small given the high risk of creating new issues for all players.”

Since the gains were minimal, likely due to deeper issues with the game and the way it was built from the beginning, the team stopped pursuing multi-GPU support in case it caused any more issues, having just finally got the game working to an acceptable standard.

Discuss on our Facebook page, HERE.

KitGuru Says: Unfortunately, due to the lack of CrossFire or SLI support, it seems playing this game at 4K/60fps will be out of reach with the current crop of high-end GPUs. Are any of you running a multi-GPU set up? I was an SLI user myself until quite recently. 

Nvidia gets first samples of GP100 from TSMC, begins internal tests (23 September 2015)

Taiwan Semiconductor Manufacturing Co. has successfully produced the first samples of Nvidia Corp.’s code-named GP100 graphics processing unit. Nvidia has already started to test the chip internally and should be on-track to release the GPU commercially in mid-2016.

3DCenter reports that Nvidia has sent the first graphics cards based on the GP100 graphics processor to its subsidiary in India, where it has a lot of hardware and software developers. No actual details about the chip or the graphics cards based on it are known, but it is about time for the graphics giant to start testing its GP100.

Nvidia taped out the GP100 in June 2015. The production cycle of TSMC’s 16nm FinFET process technology is about 90 days, so Nvidia only received its GP100 chips from TSMC very recently. Right now the company is testing the chip and its drivers internally.


Nvidia’s GP100 graphics processing unit is based on the “Pascal” architecture and is made using 16nm FinFET+ process technology. The chip is expected to integrate up to 6000 stream processors and contain around 17 billion transistors. Graphics cards featuring the GP100 will carry up to 32GB of HBM2 memory.

Nvidia did not comment on the news-story.

Discuss on our Facebook page, HERE.

KitGuru Says: It is about time for Nvidia to start testing its GP100 now. What remains to be seen is when exactly the company plans to formally introduce its next-generation GPUs. If the first revision of the chip is fully functional, the company may move the introduction of the GP100 up to the first quarter of the year.

Nvidia changes roadmap: ‘Volta’ is now due in 2018 (22 September 2015)

Nvidia Corp. has slightly changed its roadmap concerning GPU architectures. As it appears, its next-gen GPUs code-named “Pascal” are now due in 2016, whereas their successors will be released only in 2018.

Based on a new roadmap that Nvidia showcased at a tech conference in Japan, the company will release its code-named “Pascal” GPUs in 2016 and will follow on with “Volta” graphics processors in 2018. The “Pascal” chips will be made using 16nm FinFET process technology, reports WccfTech. Previously, “Volta” was expected in 2017.


Not a lot is known about the first “Pascal” GPU. Nvidia reportedly taped out its GP100 graphics processor back in June. Given the timeframe of the tape-out, it is highly likely that Nvidia is using TSMC’s advanced 16nm FinFET+ (CLN16FF+) manufacturing technology. Nvidia has also changed its approach to the roll-out of new architectures: instead of starting with simpler GPUs and introducing its biggest processors several quarters after the initial chips, Nvidia will begin the 16nm “Pascal” roll-out with the largest chip in the family.

Nvidia did not comment on the news-story.

Discuss on our Facebook page, HERE.

KitGuru Says: It looks like Nvidia is pulling in “Pascal” but slightly delaying “Volta”. The reason for this is simple: 10nm process technology. At TSMC, it will only be available to Nvidia in 2018.

Nvidia’s ‘Big Pascal’ GPU reportedly taped-out, on-track for 2016 launch – rumour (5 June 2015)

Nvidia Corp. has reportedly taped out its next-generation high-performance graphics processing unit that belongs to the “Pascal” family, according to a market rumour. If the information is correct, then Nvidia is on-track to release its new GPU around mid-2016. The company needs its “Big Pascal” graphics processor to build next-generation Tesla accelerators for high-performance computing applications and better compete against AMD on the market of consumer GPUs.

An anonymous person presumably with access to confidential information in the semiconductor industry revealed in a post over at Beyond3D forums that Nvidia had already taped out its next-generation code-named GP100 graphics processing unit. Nowadays, a tape-out means that the design of an integrated circuit has been finalized, but the first actual chips materialize only months after their tape-out.

Tape-out is the final stage of the design cycle of an integrated circuit, the point at which the artwork of the IC is sent to a maker of photomasks. Once the set of photolithographic masks is ready and verified, it is sent to the contract manufacturer of the chip, which produces the first working samples of the IC. Today’s mask sets contain 50 – 70 (up to 100) photomasks, it takes 15 – 20 hours to write a typical mask, and it may take several weeks to prepare a complete mask-set. The production cycle of a complex FinFET processor is around 90 days from wafer start to chip delivery. As a result, it takes several months to go from tape-out to an actual chip. Hence, if Nvidia taped out the GP100 in May, the company will get the first samples of the chip in August. High-volume production of chips nowadays starts between nine and twelve months after the initial tape-out.
[Image: Nvidia artwork]
The world’s No. 1 producer of discrete graphics processors will reportedly use one of Taiwan Semiconductor Manufacturing Co.’s 16nm FinFET fabrication technologies to make its “Big Pascal” GPU. Given the timeframe of the tape-out, it is highly likely that Nvidia will use TSMC’s advanced 16nm FinFET+ (CLN16FF+) manufacturing technology. According to the post, the GP100 is Nvidia’s first 16nm FinFET chip and the company has changed its approach to rolling out new architectures: instead of starting with simpler GPUs and introducing its biggest processors quarters after the initial chips, Nvidia will begin the “Pascal” roll-out with the largest chip in the family.

Nvidia’s “Pascal” architecture represents a big leap for the company. Thanks to the all-new architecture, Nvidia’s next-gen GPUs will support many new features introduced by the DirectX 12, Vulkan and OpenCL application programming interfaces. The 16nm FinFET process technology will let Nvidia’s engineers integrate considerably more stream processors and other execution units than today’s GPUs offer, significantly increasing overall performance. In addition, next-generation graphics processing units from Nvidia will support second-generation stacked high-bandwidth memory (HBM2). HBM2 will let Nvidia and its partners build graphics boards with 16GB – 32GB of onboard memory and 820GB/s – 1TB/s of bandwidth. For high-performance computing (HPC) applications, the “Big Pascal” chip will integrate NVLink interconnect technology with 80GB/s or higher bandwidth, which will significantly increase the performance of “Pascal”-based Tesla accelerators in supercomputers. Moreover, NVLink could bring major improvements to multi-GPU technologies thanks to its massive bandwidth for inter-GPU communications. According to Nvidia’s estimates, graphics adapters based on the “Pascal” architecture should deliver two to ten times higher performance than comparable graphics processors today in peak scenarios.

[Image: Nvidia Tesla K80 dual-GPU accelerator]

Nvidia needs the GP100 chip in order to build next-generation Tesla accelerators for supercomputers. Since the “Maxwell” architecture (even the GM200) offers only minimal double-precision (FP64) compute performance, it cannot be used for Tesla cards. As a result, Nvidia currently offers Tesla accelerators featuring GK110 and GK210 chips, which are basically three years old. The release of “Big Pascal” will help Nvidia boost sales of Tesla cards for HPC applications.

The accuracy of predictions of the Beyond3D forum member could not be verified, but some of his previous posts indicate that he has access to information that is not yet public. The post in the forum on Friday was republished by 3DCenter, a web-site known for predictions in the field of graphics processing units.

Nvidia did not comment on the news-story.

Discuss on our Facebook page, HERE.

KitGuru Says: On the one hand, the rumour comes from one source without a track record and should be taken with a huge grain of salt. On the other hand, Nvidia needs “Big Pascal” to update its Tesla accelerators as soon as possible. If Nvidia wants to release its GP100-based products in mid-2016, then the chip must have been taped out by now.

The post Nvidia’s ‘Big Pascal’ GPU reportedly taped-out, on-track for 2016 launch – rumour first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidias-big-pascal-gpu-reportedly-taped-out-on-track-for-2016-launch-rumour/feed/ 62
Aorus begins to sell its X5 laptop: Core i7, GeForce GTX 965M SLI, 3K IPS display https://www.kitguru.net/desktop-pc/anton-shilov/aorus-begins-to-sell-its-x5-laptop-core-i7-geforce-gtx-965m-sli-4k-ips-display/ https://www.kitguru.net/desktop-pc/anton-shilov/aorus-begins-to-sell-its-x5-laptop-core-i7-geforce-gtx-965m-sli-4k-ips-display/#comments Wed, 03 Jun 2015 02:46:52 +0000 http://www.kitguru.net/?p=252472 Aorus, a division of Gigabyte Technology that specializes on gaming laptops, on Tuesday began to sell its Aorus X5 notebooks introduced earlier this year. The company had to slightly change configuration of the mobile PCs, but the laptop is still among the highest-performing slim notebooks on the planet. The Aorus X5 notebook is based on the …

The post Aorus begins to sell its X5 laptop: Core i7, GeForce GTX 965M SLI, 3K IPS display first appeared on KitGuru.]]>
Aorus, a division of Gigabyte Technology that specializes in gaming laptops, on Tuesday began to sell its Aorus X5 notebooks, introduced earlier this year. The company had to slightly change the configuration of the mobile PCs, but the laptop is still among the highest-performing slim notebooks on the planet.

The Aorus X5 notebook is based on the Intel Core i7-5700HQ “Broadwell” processor as well as two Nvidia GeForce GTX 965M graphics adapters in SLI mode. The system is equipped with 16GB of DDR3 memory, a 256GB or 512GB solid-state drive, a 1TB hard disk drive, a Killer Networking LAN chip with 802.11ac WiFi support and so on. The mobile PC can be further upgraded to support up to 32GB of memory, up to three M.2 solid-state drives and maybe even faster graphics processors.

[Image: Aorus laptop with Nvidia G-Sync and GeForce graphics]

The new laptops from Aorus are equipped with 15.6″ IPS display panels with a 2880×1620 resolution and a rather high luminance of 350 cd/m². The notebooks feature Nvidia’s G-Sync technology, which dynamically synchronizes the display’s refresh rate with the framerate output by the graphics processing unit. Earlier this year Aorus planned to use a 4K IGZO panel from Sharp.

The Aorus X5 gaming notebooks boast extreme performance comparable to that of modern desktops, yet the machine is not particularly thick (22.9mm) or heavy (2.5 kilograms). In fact, its weight and dimensions are similar to those of non-gaming laptops released several years ago. In a bid to ensure maximum stability, the X5 employs an advanced cooling system with eight heat pipes, four vents and two fans. The thermal components are placed at the rear to keep the wrist area cool. The Aorus laptops also feature an advanced LED-backlit keyboard with programmable buttons and macro support.

[Image: Aorus X5]

Aorus will start to sell its X5 laptops in the U.K. shortly at Overclockers UK. Two configurations will be available:

  • Aorus X5 / 15.6″ WQHD+ 2880×1620 / Intel Core i7-5700HQ / Nvidia GeForce GTX 965M SLI 8GB GDDR5 / 16GB DRAM / 512GB SSD / 1TB HDD / G-Sync – £1,899 including VAT;
  • Aorus X5 / 15.6″ WQHD+ 2880×1620 / Intel Core i7-5700HQ / Nvidia GeForce GTX 965M SLI 8GB GDDR5 / 16GB DRAM / 256GB SSD / 1TB HDD / G-Sync – £1,799 including VAT.

Discuss on our Facebook page, HERE.

KitGuru Says: Although it is a little bit sad that Aorus decided not to use a 4K Ultra HD display on its X5 laptops, the 2880×1620 resolution seems better suited to a 15.6″ display. Moreover, the lower resolution also means higher performance in games. All in all, both Aorus X5 models seem to be decent gaming machines.

The post Aorus begins to sell its X5 laptop: Core i7, GeForce GTX 965M SLI, 3K IPS display first appeared on KitGuru.]]>
https://www.kitguru.net/desktop-pc/anton-shilov/aorus-begins-to-sell-its-x5-laptop-core-i7-geforce-gtx-965m-sli-4k-ips-display/feed/ 2
Jen-Hsun Huang: Every newborn human will be a gamer https://www.kitguru.net/components/graphic-cards/anton-shilov/jen-hsun-huang-every-newborn-human-will-be-a-gamer/ https://www.kitguru.net/components/graphic-cards/anton-shilov/jen-hsun-huang-every-newborn-human-will-be-a-gamer/#comments Fri, 08 May 2015 22:59:29 +0000 http://www.kitguru.net/?p=248715 Just 50 years ago the number of people, who played video games, was less than a hundred on the planet. Nowadays hundreds of millions of people play games on various devices from smartphones to premium PCs built by boutique PC makers. In the future, the number of gamers will increase even further as every newborn child will …

The post Jen-Hsun Huang: Every newborn human will be a gamer first appeared on KitGuru.]]>
Just 50 years ago, fewer than a hundred people on the planet played video games. Nowadays hundreds of millions of people play games on various devices, from smartphones to premium PCs built by boutique PC makers. In the future, the number of gamers will increase even further, as every newborn child will eventually become a gamer, according to the chief exec of Nvidia.

“I think it is also pretty clear that almost every new human is a gamer,” said Jen-Hsun Huang, chief executive officer of Nvidia, at the company’s quarterly conference call with investors and financial analysts. “The previous generation before me, very few were gamers. My generation, I would say, probably is 25% gamers. My kids’ generation is probably 75% gamers, and the generation after that is got to be 100% gamers.”

Computing devices are becoming more ubiquitous than ever these days. Just 20 years ago not a lot of people had PCs at home, video game consoles were far less popular than today and mobile phones were rare. Today, the average person owns at least one PC, a tablet or a smartphone, and hundreds of millions of people have consoles in their living rooms. Because the installed base of devices that can be used to play video games is increasing rapidly, gaming is becoming pervasive.

“Video games is no longer a niche,” said Mr. Huang. “Game is really a pop culture now, and we expect that gaming to continue to expand.”

[Image: Nvidia artwork]

Nvidia’s business heavily depends on gaming. The company sells graphics processors that are used inside high-performance gaming PCs. Nvidia also sells advanced multimedia system-on-chips, which can be used inside media tablets or game consoles. Therefore, it is not surprising that the company’s chief executive officer believes in video games.

Discuss on our Facebook page, HERE.

KitGuru Says: Without any doubt, the video games industry will continue to grow and evolve. Who knows what kind of devices will be used for gaming tomorrow?

The post Jen-Hsun Huang: Every newborn human will be a gamer first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/jen-hsun-huang-every-newborn-human-will-be-a-gamer/feed/ 3
Microsoft details cross-IHV multi-GPU tech in DirectX 12 https://www.kitguru.net/components/graphic-cards/anton-shilov/microsoft-details-cross-ihv-multi-gpu-tech-in-directx-12/ https://www.kitguru.net/components/graphic-cards/anton-shilov/microsoft-details-cross-ihv-multi-gpu-tech-in-directx-12/#comments Fri, 08 May 2015 01:14:10 +0000 http://www.kitguru.net/?p=248494 Microsoft Corp. has revealed first details about its cross-vendor multi-GPU technology supported by its upcoming DirectX 12 application programming interface. As reported, the new tech allows using graphics processing units developed by different companies to work together. However, there is a catch: applications should be designed to support the Multiadapter feature and real-world performance benefits …

The post Microsoft details cross-IHV multi-GPU tech in DirectX 12 first appeared on KitGuru.]]>
Microsoft Corp. has revealed the first details about the cross-vendor multi-GPU technology supported by its upcoming DirectX 12 application programming interface. As reported, the new tech allows graphics processing units developed by different companies to work together. However, there is a catch: applications have to be designed to support the Multiadapter feature, and the real-world performance benefits of the new tech are projected to be rather modest.

DirectX 12 is a low-level, high-throughput API with close-to-metal access to GPU and CPU resources. Low-level access to the hardware makes it possible to split the rendering workload across different graphics processors. Since multi-GPU technologies usually require synchronization between the graphics chips, which is not easy even in hardware, Microsoft proposes using its Multiadapter feature for separable and contiguous workloads. One example of a separable workload is post-processing (i.e., antialiasing, depth-of-field, etc.).

[Image: Nvidia Battlebox 4K UHD SLI system with GeForce GTX Titan cards]

If an application “knows” how to render a scene on one GPU and post-process it on another, it can use both graphics processors once it detects them. For example, millions of modern gaming personal computers powered by mainstream Intel microprocessors feature Intel integrated graphics cores in addition to discrete graphics cards. At present those integrated graphics engines are not used for rendering, but future apps that take advantage of the Multiadapter technology will be able to use them for separable and contiguous workloads.
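
To illustrate the first step such an application would have to take, the minimal sketch below enumerates every DXGI adapter in the system and creates a Direct3D 12 device on each one, which is roughly how a Multiadapter-aware title could discover an integrated GPU sitting alongside a discrete card. This is an illustrative sketch only, not Microsoft's sample code: error handling, command queues and the actual split of rendering and post-processing work are left out.

// Minimal illustrative sketch: enumerate all DXGI adapters and create a
// Direct3D 12 device on each one. A Multiadapter-aware renderer could then
// assign separable work (such as post-processing) to the second GPU.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;

    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasteriser

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            std::printf("GPU %u: %ls\n", i, desc.Description);
            devices.push_back(device); // one device per usable GPU
        }
    }

    // devices[0] would own the main render workload; devices[1], if present
    // (e.g. an Intel integrated GPU), could run post-processing passes.
    return devices.empty() ? 1 : 0;
}

In a real title, cross-adapter shared resources and a separate command queue per device would carry the intermediate frame between the two GPUs; the sketch deliberately stops at device creation.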

“Virtually every game out there makes use of post-processing to make your favorite games visually impressive, but that post-processing work doesn’t come free,” explained Andrew Yeung, a program manager in the Microsoft graphics team. “By offloading some of the post-processing work to a second GPU, the first GPU is freed up to start on the next frame before it would have otherwise been able to, improving your overall framerate.”

To demonstrate the benefits of its Multiadapter technology, Microsoft benchmarked two test systems in the DX12 Unreal Engine 4-based Elemental benchmark. One PC was equipped with a single Nvidia GeForce GTX graphics card, whereas the other featured the same board plus an Intel integrated graphics processing unit. The Multiadapter system scored 39.7 frames per second, whereas the PC with one graphics card scored 35.9 fps. A difference of roughly 10 per cent is clearly nothing big, but in certain cases even small performance gains may be important. Microsoft claims that utilization of Intel’s GPU was about 70 per cent, which means that eventually even more performance could be extracted.

[Image: DirectX 12 Multiadapter results in the Unreal Engine 4 Elemental benchmark]

Usage of heterogeneous multi-GPU is not limited to post-processing; secondary GPUs could be used for other workloads as well. However, applications have to be Multiadapter-compatible to use cross-vendor multi-GPU configurations. Microsoft did not reveal whether it is easy or hard to add Multiadapter support to programs. As a result, it is unclear whether game developers will actually use the technology to improve performance.

One thing that should be noted is that Multiadapter is not a technology designed solely to boost the performance of multi-GPU systems. The Multiadapter tech allows different workloads to be assigned precisely to different GPUs, something that could open the door to never-before-seen rendering methods.

Discuss on our Facebook page, HERE.

KitGuru Says: In theory, Microsoft’s Multiadapter technology should provide a number of interesting technological opportunities. Its adoption by game developers will depend on the cost of implementation and the actual benefits. A 10 per cent performance boost is good for game consoles, but on PCs it may not make a lot of sense for game developers.

The post Microsoft details cross-IHV multi-GPU tech in DirectX 12 first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/microsoft-details-cross-ihv-multi-gpu-tech-in-directx-12/feed/ 8
OcUK Ultima Finesse Blackhole Gaming PC Review https://www.kitguru.net/desktop-pc/gaming-rig/luke-hill/ocuk-ultima-finesse-blackhole-gaming-pc-review/ https://www.kitguru.net/desktop-pc/gaming-rig/luke-hill/ocuk-ultima-finesse-blackhole-gaming-pc-review/#comments Tue, 31 Mar 2015 07:40:58 +0000 http://www.kitguru.net/?p=241954 Nvidia's GTX 970 sits in that graphics card sweet-spot where it's about 10-20% slower than its GTX 980 brother, but retails for only 60% of the price. It's that alluring balance between performance and value that makes the GTX 970 a smart candidate for constructing a strong SLI configuration. OcUK realises that multi-GPU potential by …

The post OcUK Ultima Finesse Blackhole Gaming PC Review first appeared on KitGuru.]]>

Nvidia's GTX 970 sits in that graphics card sweet-spot where it's about 10-20% slower than its GTX 980 brother, but retails for only 60% of the price. It's that alluring balance between performance and value that makes the GTX 970 a smart candidate for constructing a strong SLI configuration.

OcUK realises that multi-GPU potential by offering up to a pair of KFA2 GTX 970 Infinity Black Edition graphics cards in its Ultima Finesse Blackhole system. Forging the path for the GTX 970s are Intel's competitive Core i7 5820K CPU and an MSI X99S SLI Plus motherboard.

[Image: OcUK Ultima Finesse Blackhole gaming PC]

OcUK wraps up the SLI-capable system in NZXT’s competitively-priced S340 chassis, giving users excellent viewing angles of the internals. Cooling the 3.9GHz-clocked 5820K is a Prolimatech Genesis, which also provides incidental airflow for the motherboard and memory. Samsung’s 250GB 840 EVO SSD and Seagate’s 2TB SSHD are called upon for storage purposes.

The fiasco regarding the 3.5GB of high-speed memory aside, the GTX 970 is still an impressive graphics card. Does the pair of custom-cooled, factory-overclocked KFA2 Infinity Black Edition cards have what it takes to deliver the goods at a 2560×1440 gaming resolution?

System Configuration:

  • CPU: Intel Core i7 5820K Six Core Haswell-E Processor overclocked to 3.9GHz
  • Motherboard: MSI X99S SLI Plus (Socket 2011) DDR4 ATX Motherboard
  • Graphics Cards: 2x KFA2 GTX 970 Infinity Black Edition 4GB Graphics Card SLI
  • Case: NZXT Source 340 Gaming Case – Black
  • Power Supply: Super Flower 750W Gold PSU
  • Cooler: Prolimatech Genesis Dual Fan CPU Cooler
  • RAM: 16GB (4x4GB) DDR4 2400MHz Quad Channel Kit
  • SSD: 250GB Samsung 840 EVO
  • Hard Drive: 2TB Seagate SSHD
  • Sound: High Definition 7.1 On-board Sound Card
  • Operating system: Windows 7 or Windows 8.1
  • Warranty: 3 years standard (up to 5 years available to purchase)
  • Price as configured: £1870.92 (at the time of writing)
The post OcUK Ultima Finesse Blackhole Gaming PC Review first appeared on KitGuru.]]>
https://www.kitguru.net/desktop-pc/gaming-rig/luke-hill/ocuk-ultima-finesse-blackhole-gaming-pc-review/feed/ 1
Nvidia: ‘Pascal’ architecture’s NVLink to enable 8-way multi-GPU capability https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-pascal-architectures-nvlink-to-enable-8-way-multi-gpu-capability/ https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-pascal-architectures-nvlink-to-enable-8-way-multi-gpu-capability/#comments Thu, 19 Mar 2015 02:42:37 +0000 http://www.kitguru.net/?p=240836 Compute performance of modern graphics processing units (GPUs) is tremendous, but so are the needs of modern applications that use such chips to display beautiful images or perform complex scientific calculations. Nowadays it is rather impossible to install more than four GPUs into a computer box and get adequate performance scaling. But brace yourself as …

The post Nvidia: ‘Pascal’ architecture’s NVLink to enable 8-way multi-GPU capability first appeared on KitGuru.]]>
Compute performance of modern graphics processing units (GPUs) is tremendous, but so are the needs of modern applications that use such chips to display beautiful images or perform complex scientific calculations. Nowadays it is practically impossible to install more than four GPUs into a single computer and get adequate performance scaling. But brace yourself: Nvidia is working on eight-way multi-GPU technology.

The vast majority of personal computers today have only one graphics processor, but many gaming PCs integrate two graphics cards for increased framerates. Enthusiasts who want unbeatable performance in select games and benchmarks opt for three-way or four-way multi-GPU setups, but these are pretty rare because scaling beyond two GPUs is not particularly good. Professionals who need high-performance GPUs for simulations, deep learning and other applications also benefit from four graphics processors and could use even more GPUs per box. Unfortunately, that is virtually impossible because of limitations imposed by today’s PCI Express and SLI technologies. However, Nvidia hopes that with the emergence of the code-named “Pascal” GPUs and the NVLink bus, it will become considerably easier to build multi-GPU machines.

[Image: Nvidia Battlebox 4K UHD SLI system with GeForce GTX Titan cards]

Today even the top-of-the-range Intel Core i7-5960X processor has only 40 PCI Express 3.0 lanes (up to 40GB/s of bandwidth) and can thus connect up to two graphics cards over PCIe 3.0 x16 or up to four cards over PCIe 3.0 x8. In both cases, the maximum bandwidth available for GPU-to-GPU communications will be limited to 16GB/s or 8GB/s respectively (useful bandwidth will be around 12GB/s and 6GB/s) in the best-case scenarios, since the GPUs need to communicate with the CPU too.

[Image: Nvidia “Pascal” NVLink presentation slide]

In a bid to considerably improve communication speed between GPUs, Nvidia will implement support for its proprietary NVLink bus in its next-generation “Pascal” GPUs. Each NVLink point-to-point connection will support 20GB/s of bandwidth in both directions simultaneously (16GB/s of effective bandwidth per direction) and each high-end “Pascal” GPU will support at least four such links. In a system with NVLink, two GPUs would therefore get a total peak bandwidth of 80GB/s (64GB/s effective) per direction between them. Moreover, PCI Express bandwidth would be preserved for CPU-to-GPU communications. In the case of a four-GPU sub-system, graphics processors would get up to 40GB/s of bandwidth to communicate with each other.
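
As a rough back-of-the-envelope illustration of the gap NVLink is meant to close, the short snippet below stacks the per-link figure quoted above against the PCIe 3.0 x16 and x8 numbers. It is purely a worked calculation using the peak figures cited in this article, not measured data or anything taken from Nvidia's materials.

// Back-of-the-envelope comparison of peer-to-peer GPU bandwidth, using only
// the peak figures quoted in this article (illustrative, not measured data).
#include <cstdio>

int main()
{
    const double pcie3_x16 = 16.0;        // GB/s per direction, PCIe 3.0 x16
    const double pcie3_x8  = 8.0;         // GB/s per direction, PCIe 3.0 x8
    const double nvlink_per_link = 20.0;  // GB/s per direction, one NVLink link

    // A high-end "Pascal" GPU is said to expose at least four NVLink links,
    // so two directly connected GPUs could stack up to four of them.
    for (int links = 1; links <= 4; ++links)
        std::printf("%d NVLink link(s): %5.1f GB/s vs PCIe 3.0 x16 at %4.1f GB/s (%.2fx)\n",
                    links, links * nvlink_per_link, pcie3_x16,
                    links * nvlink_per_link / pcie3_x16);

    std::printf("Four-card PCIe 3.0 x8 system: %.1f GB/s between a pair of GPUs\n", pcie3_x8);
    return 0;
}

On these quoted figures, even a single link already exceeds a full x16 slot, which is why Nvidia positions NVLink primarily as a peer-to-peer and CPU-to-GPU interconnect for dense multi-GPU machines.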

[Image: Nvidia “Pascal” NVLink diagram]

According to Nvidia, NVLink is projected to deliver up to two times higher performance in many applications simply by replacing the PCIe interconnect for communication among peer GPUs. It should be noted that in an NVLink-enabled system, CPU-initiated transactions such as control and configuration are still directed over a PCIe connection, while GPU-initiated transactions use NVLink, which preserves the PCIe programming model.

[Image: Nvidia “Pascal” NVLink slide]

The additional bandwidth provided by NVLink could allow one to build a personal computer with up to eight GPUs. However, to make such a machine useful in applications beyond technical computing, Nvidia will have to find a way to use eight graphics cards efficiently for rendering. Since performance scaling beyond two GPUs is generally poor, it is unlikely that eight-way multi-GPU technology will actually make it to the consumer market. However, if Nvidia manages to improve the efficiency of current multi-GPU technologies in general by replacing SLI [scalable link interface] with NVLink, that could further boost the popularity of the company’s graphics cards among gamers.

The performance improvement could be even more significant in systems that rely on NVLink entirely instead of PCI Express. IBM plans to add NVLink to select Power microprocessors for supercomputers, and the technology will be extremely useful for high-performance servers powered by Nvidia Tesla accelerators.

Discuss on our Facebook page, HERE.

KitGuru Says: While actual video games will hardly be able to use eight GPUs efficiently, we are pretty sure that 3DMark benchmarks will set records on extreme many-GPU graphics sub-systems.

The post Nvidia: ‘Pascal’ architecture’s NVLink to enable 8-way multi-GPU capability first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-pascal-architectures-nvlink-to-enable-8-way-multi-gpu-capability/feed/ 8
Microsoft confirms cross-IHV ‘AMD + Nvidia’ multi-GPU support by DX12 https://www.kitguru.net/components/graphic-cards/anton-shilov/microsoft-confirms-cross-ihv-amd-nvidia-multi-gpu-tech-support-by-dx12/ https://www.kitguru.net/components/graphic-cards/anton-shilov/microsoft-confirms-cross-ihv-amd-nvidia-multi-gpu-tech-support-by-dx12/#comments Thu, 12 Mar 2015 21:18:07 +0000 http://www.kitguru.net/?p=239638 Microsoft Corp. has confirmed that its upcoming DirectX 12 application programming interface (API) will support some sort of cross-vendor multi-GPU technology that will allow graphics processing units from different developers to work together at the same time. Unfortunately, the software developer revealed no specifics about the tech, therefore, it does not mean that one could …

The post Microsoft confirms cross-IHV ‘AMD + Nvidia’ multi-GPU support by DX12 first appeared on KitGuru.]]>
Microsoft Corp. has confirmed that its upcoming DirectX 12 application programming interface (API) will support some sort of cross-vendor multi-GPU technology that will allow graphics processing units from different developers to work together at the same time. Unfortunately, the software developer revealed no specifics about the tech; therefore, it does not mean that one could actually benefit from using a Radeon and a GeForce in the same system.

A Microsoft technical support staff member said that DirectX 12 will support “multi-GPU configurations between Nvidia and AMD”, according to a screenshot published over at LinusTechTips, a community known for various interesting findings. The Microsoft representative did not reveal the requirements necessary to make an AMD Radeon and an Nvidia GeForce run in tandem; besides, she also did not indicate the actual benefits of such a setup.

[Image: Microsoft DirectX 12 logo]

Various types of multi-GPU technologies are used to accomplish various tasks. Gamers use more than one graphics card in AMD CrossFire or Nvidia SLI configurations to get higher framerates in modern video games. Professionals use multiple graphics adapters to attach many displays to one PC. Engineers can use different types of add-in-boards (AIBs) for rendering (AMD FirePro or Nvidia Quadro) and simulation (Nvidia Tesla or Intel Xeon Phi). Artists and designers can use several GPUs for final rendering in ultra-high resolutions using ray-tracing or similar methods. While in some cases (e.g., driving multiple displays, rendering plus simulation, etc.) it is possible to use AIBs from different developers, in many other cases (gaming, ray-tracing, etc.) it is impossible due to various limitations. Independent hardware vendors (IHVs) also do not exactly like heterogeneous multi-GPU configurations, which is why it is impossible to use an AMD Radeon for rendering and an Nvidia GeForce for physics computing in games that use Nvidia’s PhysX engine.

At present it is unclear which multi-GPU configurations were referred to by the Microsoft representative.

[Image: screenshot of the Microsoft support conversation]
DISCLAIMER: KitGuru cannot verify the identity of the Microsoft representative or the authenticity of the screenshot.

While it is obvious that DirectX 12 will support contemporary cross-IHV multi-GPU configurations, a previous report suggested that the upcoming API will also allow graphics processors from different vendors to be used in cases not possible today, e.g., for rendering video games. There are multiple reasons why it is impossible to use GPUs from different vendors for real-time rendering today, including tricky rendering methods, API limitations, application limitations, differences in GPU architectures, driver limitations and so on. Microsoft’s DirectX 12 potentially removes the API-related limitations in certain cases and introduces a number of new techniques that allow the resources of two graphics cards to be used at the same time.

Since DX12 is a low-level/high-throughput API, it should have close-to-metal access to GPU resources, and in many cases this should provide a number of interesting technological opportunities. Among other things, the new API should allow cross-IHV multi-GPU configurations in applications that were made with DX12 and heterogeneous multi-GPU setups in mind. For example, it should be possible to use graphics chips from different vendors for ray-tracing, compute-intensive tasks and similar workloads. Since the architectures of the GeForce and Radeon graphics pipelines are vastly different, using two graphics cards from different vendors for real-time, latency-sensitive rendering of modern video games built on contemporary 3D engines should be extremely complicated. All multi-GPU technologies used for real-time rendering require two GPUs to be synchronized not only in terms of feature-set, but also in terms of performance, memory latencies and so on.

[Image: Nvidia Battlebox 4K UHD SLI system with GeForce GTX Titan cards]

Given all the difficulties with synchronizing cross-IHV multi-GPU setups for latency-sensitive real-time rendering, it is unlikely that such a technology could take off. However, it should be kept in mind that DirectX 12 is designed not only for personal computers, but also for Xbox One. Microsoft has been experimenting with cloud-assisted AI and physics computations for Xbox One (both are latency-insensitive). Therefore, at least theoretically, there is a technology that lets developers (or even programs) perform different tasks using different hardware resources without the need for real-time synchronization.

Perhaps DX12 will let apps use “secondary” GPUs for certain tasks without significant synchronization-related issues. The only question is whether AMD and Nvidia would welcome heterogeneous multi-GPU configurations in general and would refrain from blocking them in their drivers.

AMD and Nvidia declined to comment on the news-story.

Discuss on our Facebook page, HERE.

KitGuru Says: Considering that cross-IHV multi-GPU support in DirectX 12 is something completely unclear at the moment, take everything reported about it with a huge grain of salt. At least in theory, heterogeneous multi-GPU technologies for video games are possible, only they are not the CrossFire or SLI configurations we know today. In short, do not expect your old Radeon HD 7970 to significantly boost the performance of your shiny new GeForce GTX 980 once DirectX 12 is here.

The post Microsoft confirms cross-IHV ‘AMD + Nvidia’ multi-GPU support by DX12 first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/microsoft-confirms-cross-ihv-amd-nvidia-multi-gpu-tech-support-by-dx12/feed/ 18
DirectX12 could bring AMD and NVIDIA together https://www.kitguru.net/components/graphic-cards/brendan-morgan/directx12-could-bring-amd-and-nvidia-together/ https://www.kitguru.net/components/graphic-cards/brendan-morgan/directx12-could-bring-amd-and-nvidia-together/#comments Tue, 24 Feb 2015 21:46:42 +0000 http://www.kitguru.net/?p=237363 In a pretty exciting rumor today there have been claims that the new DirectX standard coming with Windows 10, could allow you to use graphics card from both AMD and NVIDIA. The so called, Explicit Asynchronous Multi-GPU capabilities could bring all of the GPU power in a system into a pool so that all GPUs can …

The post DirectX12 could bring AMD and NVIDIA together first appeared on KitGuru.]]>
In a pretty exciting rumor today, there have been claims that the new DirectX standard coming with Windows 10 could allow you to use graphics cards from both AMD and NVIDIA. The so-called Explicit Asynchronous Multi-GPU capabilities could bring all of the GPU power in a system into a single pool so that all GPUs can then work together on each frame.

Tom's Hardware, which sourced this information, says that “A source with knowledge of the matter gave us some early information about an ‘unspoken API’.” This is presumed to be DirectX 12. The site continues: “If you like Nvidia's GeForce Experience software and 3D Vision, but you want to use AMD's TrueAudio and FreeSync, chances are you'll be able to do that when DirectX 12 comes around.”
[Image: Nvidia vs AMD]
Of course, if this rumor is correct, it is great news for us gamers: we would no longer have to decide which graphics chip maker to stick with. If AMD cards are the best value for money, we can buy one of them, then in a few months simply slot in a new top-of-the-line NVIDIA card as well.

This is something that has been tried previously to some extent on a few motherboards; for instance, the MSI Fuzion Technology range has supported graphics cards from different manufacturers on the same board. The system that DirectX 12 is using, however, could even replace the proprietary NVIDIA SLI and ATI CrossFire multi-card systems, as they simply would no longer be required. It could also allow totally different generations of cards to be used together, something that SLI does not allow.

Discuss on our Facebook page, HERE.

KitGuru Says: This could be a massive deal if it is indeed true and isn't too hard for developers to implement. Having said that, I can imagine that neither AMD nor NVIDIA is going to like this. The thought of their cards being used alongside the competition doesn't sound like something either company would welcome. What do you think of this rumor?

Image Source: eteknix

The post DirectX12 could bring AMD and NVIDIA together first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/brendan-morgan/directx12-could-bring-amd-and-nvidia-together/feed/ 45
Nvidia plans to add G-Sync technology to laptops https://www.kitguru.net/desktop-pc/anton-shilov/nvidia-plans-to-add-g-sync-technology-to-laptops/ https://www.kitguru.net/desktop-pc/anton-shilov/nvidia-plans-to-add-g-sync-technology-to-laptops/#comments Thu, 08 Jan 2015 15:55:06 +0000 http://www.kitguru.net/?p=229602 Without any doubts, gaming notebooks are getting more and more popular these days because of many industry trends. As a result, they are gradually gaining technologies we typically find on desktop computers. In the recent years laptops gained unlocked microprocessors, overclocking memory and multi-GPU technologies. This year Nvidia Corp. will add its highly acclaimed G-Sync …

The post Nvidia plans to add G-Sync technology to laptops first appeared on KitGuru.]]>
Without any doubt, gaming notebooks are getting more and more popular these days thanks to a number of industry trends. As a result, they are gradually gaining technologies we typically find in desktop computers. In recent years laptops have gained unlocked microprocessors, overclockable memory and multi-GPU technologies. This year Nvidia Corp. will add its highly acclaimed G-Sync to the list.

Nvidia and its partners are working to add G-Sync technology, which dynamically matches a GPU’s rendering rate with a display’s refresh rate, to mobile computers, reports SweClockers. One of the first notebooks to support Nvidia’s G-Sync is the Aorus X7 Pro, which sports two GeForce GTX 970M graphics adapters in SLI mode. At present, the GPU developer is polishing its drivers and plans to add the feature officially in the course of the first quarter of 2015.

[Image: Aorus X7 laptop with Nvidia G-Sync and GeForce graphics]

The report does not reveal many details about the implementation of the technology, which is something very important. Nvidia and its partners use a custom proprietary display scaler to enable G-Sync on desktops and plan to continue doing so in the future. Theoretically, Nvidia could add the chip to laptops too in order to unify the implementation of the technology across different platforms. Still, on notebooks everything could be done in a very different way. Virtually all modern laptop displays are connected to graphics adapters using embedded DisplayPort (eDP) technology, which supports a feature called Panel Self-Refresh that allows the refresh rate of the panel to be controlled. Using PSR, Nvidia could add G-Sync to loads of existing gaming notebooks; it is just a matter of time and testing.

AMD uses PSR to implement its FreeSync technology on desktops. In fact, the first demonstrations of FreeSync were conducted on notebook computers since the technology was all there.

Discuss on our Facebook page, HERE.

KitGuru Says: If Nvidia decides to use PSR instead of a proprietary display scaler, it will mean that it will support VESA’s Adaptive-Sync technology. If the company decides to use a proprietary chip instead of an industry-standard one, then this will naturally make G-Sync-enabled mobile PCs more expensive.

The post Nvidia plans to add G-Sync technology to laptops first appeared on KitGuru.]]>
https://www.kitguru.net/desktop-pc/anton-shilov/nvidia-plans-to-add-g-sync-technology-to-laptops/feed/ 3
Gigabyte sells GTX 980 WaterForce 3-way SLI kit for £2499/$2999 https://www.kitguru.net/components/graphic-cards/anton-shilov/gigabyte-starts-to-sell-geforce-gtx-980-waterforce-3-way-sli-kit-for-24992999/ https://www.kitguru.net/components/graphic-cards/anton-shilov/gigabyte-starts-to-sell-geforce-gtx-980-waterforce-3-way-sli-kit-for-24992999/#comments Thu, 11 Dec 2014 06:03:29 +0000 http://www.kitguru.net/?p=225815 Gigabyte Technology has started to sell what could be the highest-performing retail graphics processing solution ever developed. The Gigabyte GeForce GTX 980 WaterForce 3-way SLI includes three massively factory-overclocked graphics cards that are designed to work in multi-GPU mode as well as a special liquid cooling solution. But extreme performance comes at extreme price. Designed …

The post Gigabyte sells GTX 980 WaterForce 3-way SLI kit for £2499/$2999 first appeared on KitGuru.]]>
Gigabyte Technology has started to sell what could be the highest-performing retail graphics processing solution ever developed. The Gigabyte GeForce GTX 980 WaterForce 3-way SLI includes three massively factory-overclocked graphics cards that are designed to work in multi-GPU mode as well as a special liquid cooling solution. But extreme performance comes at an extreme price.

Designed to deliver ultimate framerates in the latest games at ultra-high-definition resolutions, the Gigabyte GeForce GTX 980 WaterForce 3-way SLI (GV-N980X3WA-4GD) graphics board kit features default GPU clock-rates of 1228MHz/1329MHz (base/boost). Each graphics card sports 4GB of GDDR5 memory at 7GHz, three DisplayPort outputs, one HDMI and two DVI.

[Image: Gigabyte GeForce GTX 980 WaterForce 3-way SLI kit]

Gigabyte’s three-card graphics product is a unique offering designed to provide unbeatable performance at price-points already tested by Nvidia Corp. itself with its GeForce GTX Titan Z. While this three-way multi-GPU kit should potentially offer higher performance than the dual-GPU graphics card, it should be noted that not all video games scale well across three GPUs. As a consequence, Gigabyte’s solution will not be the absolute best graphics sub-system in all cases. Still, unlike the GTX Titan Z, the GTX 980 WaterForce 3-way SLI kit features high clock-rates and will offer decent performance even when only one GPU is active; hence, in general the triple-GPU configuration should deliver high performance in almost all games.

Thanks to the closed-loop liquid cooling solution, the kit produces only 30.4dB of noise under full load. The 3-way SLI GeForce GTX 980 graphics processing kit runs 42.6 per cent cooler than reference-design graphics cards while delivering up to 20 per cent higher performance.

[Image: Gigabyte GeForce GTX 980 WaterForce 3-way SLI kit with external radiator box]

According to Gigabyte Technology, the WaterForce cooler is the best cooling solution for a three-way SLI GeForce GTX 980 graphics sub-system, which can easily be true since there are no off-the-shelf closed-loop liquid cooling kits for three graphics cards. Each graphics board comes pre-installed with its own exclusive waterblock and independent radiator connected by a pair of coolant pipes. All three 120mm radiators are placed inside the external metal water-cooling box for maximum efficiency. The box sports three 120mm fans.

[Image: Gigabyte GeForce GTX 980 WaterForce 3-way SLI water cooling]

The Gigabyte WaterForce liquid cooler features an intuitive control unit with an OLED display, where users can monitor and adjust all the graphics card settings, including temperature, fan speed and pump speed. Moreover, the Gigabyte kit features a target temperature mode that automatically adjusts fan and pump speeds to achieve user-defined temperatures.

On Wednesday Overclockers UK and Newegg began to sell and take pre-orders on the Gigabyte GeForce GTX 980 WaterForce 3-way SLI graphics card kit at rather whopping price-points. In the U.K. the 3-way kit costs £2499.99 including VAT, whereas in the U.S. the 3-way graphics sub-system is priced at $2999.99.

The GTX 980 WaterForce 3-way SLI kit (GV-N980X3WA-4GD) will eventually be available in Taiwan, Australia, China, France, Germany, Japan, Korea and Russia beginning this month.

Discuss on our Facebook page, HERE.

KitGuru Says: While the monstrous GTX 980 WaterForce 3-way SLI kit is designed to deliver ultimate gaming performance for hardcore gamers, it comes at a price that far exceeds the cost of three GeForce GTX 980 graphics adapters. In fact, the three-way liquid cooling solution effectively costs a rather whopping $1350/£1300, an extreme price for a cooler.

The post Gigabyte sells GTX 980 WaterForce 3-way SLI kit for £2499/$2999 first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/gigabyte-starts-to-sell-geforce-gtx-980-waterforce-3-way-sli-kit-for-24992999/feed/ 7
OCUK offering free SLI upgrade on Titan Z Battleboxes https://www.kitguru.net/desktop-pc/matthew-wilson/ocuk-offering-free-sli-upgrade-on-titan-z-battleboxes/ https://www.kitguru.net/desktop-pc/matthew-wilson/ocuk-offering-free-sli-upgrade-on-titan-z-battleboxes/#comments Wed, 13 Aug 2014 12:36:45 +0000 http://www.kitguru.net/?p=207199 Overclockers UK is looking to sweeten the deal for those of you who are looking to drop about four grand on a new PC by offering a free SLI upgrade on Nvidia's Titan Z Battlebox 4K gaming systems. The Battlebox Titan Z Extreme edition costs £3999 and includes an Intel Core i7 4790k CPU, overclocked …

The post OCUK offering free SLI upgrade on Titan Z Battleboxes first appeared on KitGuru.]]>
Overclockers UK is looking to sweeten the deal for those of you who are looking to drop about four grand on a new PC by offering a free SLI upgrade on Nvidia's Titan Z Battlebox 4K gaming systems.

The Battlebox Titan Z Extreme edition costs £3999 and includes an Intel Core i7 4790K CPU, overclocked to 4.6GHz, running on an Asus Z97 Maximus VII Formula motherboard. In addition, the system can support up to 32GB of 2400MHz RAM and your choice of primary and secondary drive: the default option is a 256GB Samsung 840 Evo coupled with a 2TB Seagate Barracuda. The parts are all packed inside a Phanteks Enthoo Pro mid-tower case.

[Image: Nvidia GeForce GTX Titan Z]

There is a second system on offer: the Battlebox Titan Z Hydro edition. This system costs £4999 and features all of the same options as the non-Hydro system. However, as the name suggests, this system comes with water cooling on the CPU, which allows the 4790K to be overclocked to 4.7GHz.

The Titan Z released earlier this year, featuring 12GB of VRAM and two fully unlocked Titan Black cores on a single PCB. The Titan Z usually costs £2000 on its own, which actually makes these Battlebox systems quite good value considering that you'll be getting two of them. You'll also get two free games, GRID Autosport and Borderlands: The Pre-Sequel.

Needless to say, these systems are aimed at those who already own or plan on owning a 4K monitor and want to game with the details turned up. This offer comes in light of recent rumors that Nvidia is due to launch the GTX 800 series next month: this could be Nvidia's way of trying to clear stock before the new cards are announced. However, this is all highly speculative.

Discuss on our Facebook page, HERE.

KitGuru Says: These systems are very expensive but the offer is actually very good, especially if you were planning a huge upgrade. Do any of our readers own a Titan Z already? How many of you game on a 4K monitor? 

The post OCUK offering free SLI upgrade on Titan Z Battleboxes first appeared on KitGuru.]]>
https://www.kitguru.net/desktop-pc/matthew-wilson/ocuk-offering-free-sli-upgrade-on-titan-z-battleboxes/feed/ 1
Nvidia GeForce GTX Titan Z is finally available. But what about reviews? https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-geforce-gtx-titan-z-is-finally-available-but-what-about-its-reviews/ https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-geforce-gtx-titan-z-is-finally-available-but-what-about-its-reviews/#comments Thu, 05 Jun 2014 22:41:46 +0000 http://www.kitguru.net/?p=196827 It has been a week since Nvidia Corp. and its partners finally started to sell the GeForce GTX Titan Z graphics cards powered by two GK110 graphics processing units. The graphics solution costs whopping $2999 in the U.S., £2399.99 in the UK and around €3000 in Europe. While the price and technical specifications of the …

The post Nvidia GeForce GTX Titan Z is finally available. But what about reviews? first appeared on KitGuru.]]>
It has been a week since Nvidia Corp. and its partners finally started to sell the GeForce GTX Titan Z graphics cards powered by two GK110 graphics processing units. The graphics solution costs a whopping $2999 in the U.S., £2399.99 in the UK and around €3000 in Europe. While the price and technical specifications of the product are known, the interesting thing is that almost nobody knows exactly how it performs. The reason? There are no reviews of the Titan Z! Not a single one.

When the board was announced at the GPU Technology Conference in March, it was proclaimed to be the world’s highest-performing consumer-class graphics solution, featuring the two most complex graphics processing units ever made as well as 12GB of fast GDDR5 memory. Thanks to 5760 stream processors in total, the Titan Z card offers compute performance of over 8TFLOPS, which, at the time, was outstanding. The GeForce GTX Titan Z was not expected to have any rivals, so the $3000 price-tag was justified, at least to a certain degree.

[Image: Nvidia GeForce GTX Titan Z]

But just days after Nvidia demonstrated its Titan Z, the first rumours about specifications of a dual-chip graphics card from AMD – the Radeon R9 295X2 – started to show up. The Radeon R9 295X2 was formally launched on the 8th of April and was released commercially on the 21st of April, 2014, about two months ahead of the GeForce GTX Titan Z. But a bigger problem for Nvidia was that the Radeon R9 295X2 offered 11TFLOPS of compute power and potentially higher performance in games.

The reviews of the R9 295X2 clearly demonstrated that a pair of GeForce GTX Titan Black graphics cards in SLI easily outperforms AMD’s dual-chip flagship. However, the performance of the GTX Titan Z was projected to be lower than that of two Titan Blacks in SLI. Therefore, it was unclear whether the GTX Titan Z would be faster than the Radeon R9 295X2 in video games or not.

The GeForce GTX Titan Z was expected to emerge on the market in late April, but it did not. Unofficial sources close to Nvidia said that the company was refining its drivers in order to beat the Radeon R9 295X2 and justify its higher price.

The GeForce GTX Titan Z graphics cards are finally available now, the new drivers with improved SLI multi-GPU scalability are here as well. Yet, we still have no idea whether the Titan Z is faster or slower compared to the Radeon R9 295X2 due to lack of reviews from mainstream web-sites. Nvidia simply does not provide the cards for testing.

[Image: Nvidia GeForce GTX Titan Z]

The most interesting thing is the reason why there are no review samples of the GeForce GTX Titan Z. According to Nvidia representatives responsible for reviews, the GTX Titan Z is… not a gaming graphics card. While the board is clearly more than just a graphics adapter for gamers, since it supports double-precision computations and, perhaps, a few other things not present on typical gaming solutions, Nvidia’s own web-site states that the “Titan Z is engineered for next-generation 5K and multi-monitor gaming”.

The GeForce GTX Titan Z is supposed to offer great performance in video games. We are not really sure about its thermals and noise (since it is cooled by a single fan), but when it comes to performance, it should be good. Will it beat the Radeon R9 295X2? In many cases it most probably will; in other cases it will not (check out the Titan Z's preliminary test results here). The main problem is that the graphics card costs an extreme amount of money for a consumer solution. It costs twice as much as its main rival, the Radeon R9 295X2 (which also offers features like double-precision computing, etc.), but it obviously does not offer twice the performance.

Discuss on our Facebook page, HERE.

KitGuru Says: Back in the day when Nvidia unveiled its first 4-way multi-GPU SLI technology, it also officially positioned it as an exclusive solution for boutique PC makers. The first independent reviews showed that there were problems with multi-GPU scaling, compatibility and stability. Eventually Nvidia corrected the issues, and quad SLI is a rather popular high-end technology nowadays. We are not sure that anything is wrong with the Titan Z, except its price…

The post Nvidia GeForce GTX Titan Z is finally available. But what about reviews? first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-geforce-gtx-titan-z-is-finally-available-but-what-about-its-reviews/feed/ 6
The Titan Z has not been cancelled or delayed https://www.kitguru.net/components/graphic-cards/matthew-wilson/the-titan-z-is-not-cancelled/ https://www.kitguru.net/components/graphic-cards/matthew-wilson/the-titan-z-is-not-cancelled/#respond Wed, 21 May 2014 11:16:11 +0000 http://www.kitguru.net/?p=194292 Nvidia revealed the $3000 Titan Z graphics card all the way back in March, which was some time ago. Since then,  the company has been pretty quiet on its dual graphics solution, which sparked rumours of cancellation or an indefinite delay. However, it seems the rumours weren't true as Nvidia's CEO has cleared things up. …

The post The Titan Z has not been cancelled or delayed first appeared on KitGuru.]]>
Nvidia revealed the $3000 Titan Z graphics card all the way back in March, which was some time ago. Since then, the company has been pretty quiet about its dual-GPU graphics solution, which sparked rumours of a cancellation or an indefinite delay. However, it seems the rumours weren't true, as Nvidia's CEO has cleared things up.

In an interview with Cnet, Nvidia CEO, Jen-Hsun Huang, was asked if there was any truth to the Titan Z being cancelled or delayed, to which he replied: “No, no, that's silliness.”

[Image: Jen-Hsun Huang with the Nvidia GeForce GTX Titan Z]

Even though the Titan Z was revealed almost two months ago, according to Huang it is still on track. He didn't explain why the card is taking so long to release, but he did explain the thinking behind its monstrous $3000 price tag, stating that most Titan customers re-buy every year:

“Most of the customers that buy Titan Zs buy it every year,” Huang said. “The reason for that is the people who buy Titans and Titan Zs have an insatiable need for computing capability, graphics computing capability. So either they got tired of using just a 1,080p monitor and they just bought a 4K. My Titan all of a sudden's not enough. For a 4K monitor, a $3,000 to $5,000 monitor, I need something bigger to drive it. So that's Titan Z.”

There was no hint as to when the Titan Z will actually release but apparently it is still coming, will still cost $3000 and hasn't been delayed at all.

Discuss on our Facebook page, HERE.

KitGuru Says: We don't have any reliable performance numbers for the Titan Z just yet, so it's hard to tell whether it will actually be worth the huge price tag, even when compared to something like the R9 295X2. The long wait for Nvidia's dual-GPU card might also put some potential buyers off, as many enthusiasts are going to be waiting for the release of the Maxwell architecture, which is rumoured to launch later this year.

Source: Cnet

The post The Titan Z has not been cancelled or delayed first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/matthew-wilson/the-titan-z-is-not-cancelled/feed/ 0