directx 12 | KitGuru https://www.kitguru.net

AMD enables Frame Generation for all DX 11/12 games with new driver https://www.kitguru.net/gaming/joao-silva/amd-enables-frame-generation-for-all-dx-11-12-games-with-new-driver/ Tue, 17 Oct 2023 15:15:46 +0000

AMD's Fluid Motion Frames technology is still in the preview phase. The new Frame Generation technique initially kicked off with support for a handful of titles but the latest Preview Driver expands things greatly, allowing the feature to be enabled for all DX11 and DX12 games. 

The most significant change is that AFMF may now be activated for all DirectX 11 and DirectX 12 games using HYPR-RX or the AMD Fluid Motion Frames toggle, marking a major milestone for the technology. As such, AMD's Frame Generation technology now supports more games than Nvidia's DLSS 3.

Here are the full patch notes for the AMD Software: Adrenalin Edition Technical Preview Driver for AMD Fluid Motion Frames Version 23.30.01.02:

  • We have added initial support for HDR to expand the AFMF gaming experience.
  • After monitoring user experience feedback, AFMF can now be globally enabled on all DirectX® 11 and 12 titles.
    • Users may use the per-app settings to individually disable or enable AFMF.
  • Improvements to frame pacing have been made, resulting in an overall smoother gameplay experience and improved image quality.
  • Additional game support for AMD Radeon™ Anti-Lag+
    • MechWarrior 5: Mercenaries
    • Deep Rock Galactic
    • Warhammer 40,000: Darktide
    • Sniper Elite 5
    • Returnal
    • Remnant II
    • Spider-Man: Miles Morales
    • Spider-Man Remastered
    • PUBG: BATTLEGROUNDS
  • Call of Duty: Modern Warfare® II
  • Tiny Tina's Wonderlands
    • Hogwarts Legacy
    • Resident Evil 3

The list shows that Anti-Lag+ is now supported on a few new games as well, but note that the technology still has some issues in online gaming, as we saw recently with Counter-Strike 2 players triggering VAC bans. Those interested in the new preview driver can download it HERE.

Discuss on our Facebook page, HERE.

KitGuru says: Have you already tried the new driver? How was your experience with Frame Generation enabled?

Nvidia’s latest graphics driver increases performance in DX12 games by up to 24% https://www.kitguru.net/components/graphic-cards/joao-silva/nvidias-latest-graphics-driver-increases-performance-in-dx12-games-by-up-to-24/ Thu, 13 Oct 2022 11:30:03 +0000

This week's Nvidia graphics driver update not only adds support for the RTX 4090, but also makes some new optimisations, improving DX12 performance for RTX 30 series GPUs by up to 24 percent. 

While a lot of games still offer a DX11 mode, the majority of titles now support DX12. The new GeForce Game Ready 522.25 WHQL driver significantly boosts RTX 30 GPUs in several titles, introducing new shader compilation optimisations, reduced CPU overhead and Resizable BAR (ReBAR) support for games such as Forza Horizon 5 and F1 22. With these enhancements, Nvidia hopes to reduce the occurrence of CPU-bound scenarios, giving the GPU room to work harder.

The performance gains vary depending on the title. Nvidia only shared performance figures for Assassin's Creed Valhalla, Cyberpunk 2077 (RT + DLSS) and Forza Horizon 5, but other titles also see noticeable gains. The full list can be found below:

  • Assassin's Creed Valhalla: up to 24% (1080p)
  • Battlefield 2042: up to 7% (1080p)
  • Borderlands 3: up to 8% (1080p)
  • Call of Duty: Vanguard: up to 12% (4K)
  • Control: up to 6% (4K)
  • Cyberpunk 2077: up to 20% (1080p)
  • F1 22: up to 17% (4K)
  • Far Cry 6: up to 5% (1440p)
  • Forza Horizon 5: up to 8% (1080p)
  • Horizon Zero Dawn: Complete Edition: up to 8% (4K)
  • Red Dead Redemption 2: up to 7% (1080p)
  • Shadow of the Tomb Raider: up to 5% (1080p)
  • Tom Clancy’s The Division 2: up to 5% (1080p)
  • Watch Dogs: Legion: up to 9% (1440p)

The RTX 4090 isn't the only GPU receiving support with the 522.25 Game Ready driver. VideoCardz also found three new RTX 30 IDs in the “nv_dispig.inf” file that were not mentioned in the official patch notes. One of them is a GA102-based RTX 3070 Ti. The others are a new RTX 3060 Ti based on the GA104 and a new GA106-based RTX 3060. These new cards could be the same as the ones mentioned in recent rumours regarding new RTX 30 series cards with GDDR6X memory.

Discuss on our Facebook page, HERE.

KitGuru says: Have you already tried the new driver? Did you notice any performance gains in any DX12-based game you have on your PC?

Intel drops DX9 support for Arc and Xe GPUs in favour of an open-source workaround https://www.kitguru.net/components/graphic-cards/joao-silva/intel-drops-dx9-support-for-arc-and-xe-gpus-in-favour-of-an-open-source-workaround/ Mon, 15 Aug 2022 09:10:18 +0000

Intel appears to be having trouble with DX9 on its new, modern GPUs. As a result, Intel is changing things up and will drop DirectX 9 support on Arc and Xe GPUs, and will instead use an open-source mapping layer to emulate DX9 using DX12. 

Intel shared the news on its support page, noting that if you're running an 11th Gen or older Intel processor with an Intel Arc graphics card, DX9-based apps will still run natively, but only on the processor's integrated graphics. In practice, though, chances are you'll be using the Arc desktop card to display and render content, in which case the system will use DX9On12 instead of DX9.

In summary, DX9On12 implements the D3D9 usermode DDI (device driver interface) to map commands from D3D9 to D3D12 using a conversion layer. In comparison, hardware supporting DirectX 9 sends the commands directly into the D3D9 driver.
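
Microsoft publishes this mapping layer as D3D9On12, and the Windows SDK exposes an explicit opt-in entry point for applications that want to force it. Below is a minimal sketch of that documented API, assuming the Windows 10 SDK headers; on an Arc system the driver routes DX9 through the layer automatically, so games don't need to do any of this themselves:

```cpp
// Minimal sketch: explicitly opting into the D3D9On12 mapping layer.
// On Intel's driver the routing happens automatically; this just
// shows the documented entry point from d3d9on12.h.
#include <d3d9.h>
#include <d3d9on12.h>

IDirect3D9* CreateD3D9MappedToD3D12()
{
    D3D9ON12_ARGS args = {};
    args.Enable9On12 = TRUE;   // route D3D9 commands through a D3D12 device
    // args.pD3D12Device left null: the layer creates its own D3D12 device.

    return Direct3DCreate9On12(D3D_SDK_VERSION, &args, 1);
}
```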

Generally speaking, you shouldn't notice any loss of performance. If you find a specific case where performance is noticeably worse using DX9On12, you can always report it on GitHub so the developers can fix it.

Discuss on our Facebook page, HERE.

KitGuru says: Now that Intel has decided to drop support for native DX9 on its latest graphic cards and iGPUs, it will be interesting to see if Nvidia and AMD will follow suit.

Intel disables DirectX 12 API on multiple 4th Gen Core processors https://www.kitguru.net/components/cpu/joao-silva/intel-disables-directx-12-api-on-multiple-4th-gen-core-processors/ Mon, 08 Nov 2021 13:33:22 +0000

Intel's 4th Gen Core “Haswell” processors are around eight years old at this point but continue to receive updates. This time around, a security bug was discovered and, unfortunately, this has led to Intel disabling support for the DirectX 12 API on most Haswell-based processors for desktops and laptops.

According to Intel, iGPUs on 4th Generation Intel Core processors may allow escalation of privilege. From graphics driver version 15.40.44.5107 onwards, support for the DirectX 12 API will be disabled to prevent this vulnerability from being exploited.

By disabling the DirectX 12 API, users of Intel 4th Gen Core processors with this driver or a newer one won't be able to run DirectX 12-based games and applications. Those using a dedicated graphics card alongside the affected 4th Gen CPUs should still be able to run DirectX 12-based apps. Some processors affected by the vulnerability include best-sellers like the Core i5-4690K and the Core i7-4790K. The complete list is as follows:

  • 4th Generation Intel Core Processors with Intel Iris Pro Graphics 5200
  • 4th Generation Intel Core Processors with Intel Iris Graphics 5100
  • 4th Generation Intel Core Processors with Intel HD Graphics 5000/4600/4400/4200
  • Intel Pentium and Celeron Processors with Intel HD Graphics based on 4th Generation Intel Core

If you wish to keep running DirectX 12 API-based apps on the processors' iGPU, Intel recommends using graphics driver version 15.40.42.5063 or older.
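
For developers wondering whether a given system still accepts DirectX 12 work after a driver change like this, a lightweight runtime probe is possible. This is a generic sketch, not anything Intel-specific: passing a null output pointer to D3D12CreateDevice asks whether device creation would succeed without actually creating one.

```cpp
// Hedged sketch: probe whether a D3D12 device can be created on the
// default adapter. With the device pointer set to nullptr,
// D3D12CreateDevice only tests for support (returning S_FALSE on
// success) rather than creating a device.
#include <d3d12.h>

bool IsDirectX12Available()
{
    HRESULT hr = D3D12CreateDevice(
        nullptr,                    // default adapter
        D3D_FEATURE_LEVEL_11_0,     // minimum feature level for D3D12
        __uuidof(ID3D12Device),
        nullptr);                   // no device is created, just a check
    return SUCCEEDED(hr);
}
```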

KitGuru says: Are you currently running a system equipped with an Intel processor affected by this vulnerability? 

Microsoft is bringing Xbox Series X/S ‘Auto HDR’ feature to PC https://www.kitguru.net/gaming/matthew-wilson/microsoft-is-bringing-xbox-series-x-s-auto-hdr-feature-to-pc/ Thu, 18 Mar 2021 14:30:04 +0000

One of Microsoft's big features for backwards compatible games on Xbox Series X/S consoles has been Auto HDR, enabling High Dynamic Range across a number of SDR-only games. Now, PC gamers are also going to benefit, with Microsoft preparing to enable Auto HDR for over 1,000 PC games.

Auto HDR will be enabled in both DirectX 11 and DirectX 12 games. DirectX Program Manager, Hannah Fisher, explained the benefits of Auto HDR in a developer blog post:

“While some game studios develop for HDR gaming PCs by mastering their game natively for HDR, Auto HDR for PC will take DirectX 11 or DirectX 12 SDR-only games and intelligently expand the colour/brightness range up to HDR. It’s a seamless platform feature that will give you an amazing new gaming experience that takes full advantage of your HDR monitor’s capabilities.”
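
Microsoft hasn't published the exact expansion algorithm, so purely as a hypothetical illustration of the idea, here's what a naive inverse tone map might look like: darker tones stay close to their SDR brightness while highlights stretch towards the display's peak. Auto HDR's real implementation is considerably smarter than this; the function and its exponent are invented for the sketch.

```cpp
// Illustrative only: a naive SDR -> HDR luminance expansion. This is
// NOT Microsoft's Auto HDR algorithm, just a toy inverse tone map.
#include <cmath>

// sdr:      scene luminance normalised to [0, 1] (SDR reference ~100 nits)
// peakNits: the HDR display's peak brightness, e.g. 1000.0f
float ExpandLuminance(float sdr, float peakNits)
{
    const float sdrNits = 100.0f;            // SDR reference white
    float highlight = std::pow(sdr, 1.8f);   // weights bright regions
    // Dark tones stay near their SDR level; bright tones stretch upward.
    return sdr * sdrNits + highlight * (peakNits - sdrNits);
}
```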

In an example image (seen above), we can see how Auto HDR impacts the luminance in a scene from Gears 5. Of course, Gears 5 already has native HDR support, so while Auto HDR doesn't bring the same level of colour detail, it gets quite close. In games that don't support HDR at all, Auto HDR can make an impressive difference.

Currently, Auto HDR is in preview, available to Windows Insider build testers. Since the feature is still in testing, there are some bugs to work out and there will be additional optimisation, as Auto HDR does use some GPU compute power. Just a few games support the feature for now, but as testing continues, more games will be added, with plans to enable Auto HDR across the top 1,000 DX11 and DX12 titles.

Discuss on our Facebook page, HERE.

KitGuru Says: If you have an HDR-capable monitor and happen to be a Windows Insider, then this is worth checking out. Auto HDR works well on the Xbox Series X, so it will be interesting to compare that experience to the same feature on PC.

Early Nvidia RTX 3060 benchmarks appear in Ashes of the Singularity database https://www.kitguru.net/components/graphic-cards/joao-silva/early-nvidia-rtx-3060-benchmarks-appear-in-ashes-of-the-singularity-database/ Tue, 23 Feb 2021 16:22:25 +0000

The RTX 3060 is just a couple of days away from launch and as a result, early performance benchmarks are being posted fairly frequently. The latest tests show RTX 3060 performance in Ashes of the Singularity, giving us a glimpse at how the new card compares to its predecessor, as well as other GPUs in the RTX 30 series line-up. 

The latest RTX 3060 benchmark comes our way via @leakbench, who shared the first RTX 3060 score found on the Ashes of the Singularity database. Since then, more entries have popped up using the same GPU, but for this post, we'll be focusing on the benchmark using DX12 and the 1080p ‘Crazy' graphics preset. During this test, the RTX 3060 was paired with an Intel Core i7-8700K and 32GB of RAM.

Compared to the average RTX 2060 score, the RTX 3060 appears to be 20 percent faster. In fact, the new RTX 3060 appears to beat out most RTX 2070 models, putting it in a similar performance bracket to the RTX 2070 Super.

Nvidia will be launching the RTX 3060 on the 25th of February. MSRP pricing is set at £299/$329, although custom-cooled GPUs from board partners will likely cost considerably more.

KitGuru says: Are any of you planning on upgrading your GPU this year? Are you planning on picking up an RTX 3060? 

NVIDIA RTX supported games don’t use ‘proprietary’ tech https://www.kitguru.net/components/graphic-cards/matthew-wilson/nvidia-rtx-supported-games-dont-use-proprietary-tech/ Thu, 12 Nov 2020 12:29:52 +0000

With AMD entering the ray-tracing capable GPU market this month, there has been some confusion around what games will be supported and whether or not previous RTX-supported titles are using proprietary technology for ray-tracing. Fortunately, Nvidia has begun clearing the air. 

In a recent interview with AdoredTV, an AMD spokesperson said that it won't support “games making use of proprietary raytracing APIs and extensions”. This caused a stir and some assumptions that RTX-supported titles are using a proprietary Nvidia API or extension. This is not the case though, as the vast majority of ray-tracing titles use Microsoft's DirectX Raytracing, itself an extension of DX12, which is open to all platforms.

Speaking with wccftech, Nvidia's Brian Burke explained that there are just three games on the market that aren't using Microsoft's DXR technology – Quake II RTX, Wolfenstein: Youngblood and JX3. These three games use an Nvidia ray-tracing extension for the Vulkan API, simply as a workaround while the official Vulkan Ray Tracing extension is being worked on.

Both Nvidia and AMD GPUs with ray-tracing acceleration will be able to use the current industry standard, which is Microsoft's DXR. This clarifies previous confusion around Cyberpunk 2077, which was incorrectly claimed to only support ray tracing on Nvidia GPUs; as it uses DXR, ray tracing will work on AMD's RX 6000 series cards. Nvidia also claims there is nothing stopping AMD from creating its own RT extension for Vulkan while the Khronos Group finalises its official extension for industry-wide use.

Lastly, AMD could also potentially use Nvidia's Vulkan ray-tracing extension. This is something Intel has been considering for its future Xe HPG GPUs, as pointed out by Tom's Hardware. Nvidia has spoken to us about this and made it clear that the extension is not ‘proprietary' in the sense that it is open for other manufacturers to use. Nvidia is adamant AMD could implement support for its Vulkan extension should it wish to.
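
For engines that want to pick a path at runtime, the decision comes down to which extensions the installed driver actually exposes. A minimal sketch, assuming the Vulkan headers are available (the Khronos extension discussed here was later finalised as VK_KHR_ray_tracing_pipeline):

```cpp
// Hedged sketch: check whether a physical device exposes Nvidia's
// vendor ray-tracing extension or the cross-vendor Khronos one.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool HasExtension(VkPhysicalDevice gpu, const char* name)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const VkExtensionProperties& e : exts)
        if (std::strcmp(e.extensionName, name) == 0)
            return true;
    return false;
}

// Usage: HasExtension(gpu, "VK_NV_ray_tracing");           // vendor path
//        HasExtension(gpu, "VK_KHR_ray_tracing_pipeline"); // Khronos path
```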

We reached out to AMD to get a statement on this and a spokesperson told us the following: “AMD will support all raytracing titles using industry-based standards, including the Microsoft DXR API and the upcoming Vulkan raytracing API. Games making use of proprietary raytracing APIs and extensions will not be supported.”

KitGuru Says: Hopefully this clears up any confusion. Rest assured, when RX 6000 series GPUs launch, they should be capable of ray-tracing in most games that include it. 

Microsoft launches DirectX 12 Ultimate to enable future-proof graphics features for next-gen https://www.kitguru.net/components/graphic-cards/matthew-wilson/microsoft-launches-directx-12-ultimate-to-enable-future-proof-graphics-features-for-next-gen/ Thu, 19 Mar 2020 19:02:13 +0000

The DirectX 12 API has been around for most of the current gaming generation but as we look towards the future, an update is due. Today, Microsoft announced DirectX 12 Ultimate, labelling it as the “best graphics technology” it has put out to date, which also reaches an “unprecedented alignment” between PC and the upcoming Xbox Series X console.

The key reason for this update is to push forward new graphics technologies that will play a bigger role next-gen. This includes the likes of Raytracing, Variable Rate Shading, Mesh Shaders, Sampler Feedback and more. According to Microsoft, the upgrade to DX12 Ultimate ensures “future-proof feature support for next generation games”.
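
Each of those headline features corresponds to a capability an application can query through D3D12. A hedged sketch follows; the tier thresholds match how DirectX 12 Ultimate has generally been described (DXR 1.1, VRS Tier 2, Mesh Shader Tier 1, Sampler Feedback 0.9), but treat the exact cut-offs as an assumption:

```cpp
// Hedged sketch: the four DX12 Ultimate features map onto caps that
// ID3D12Device::CheckFeatureSupport can report.
#include <d3d12.h>

bool SupportsDX12Ultimate(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};  // raytracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};  // variable-rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};  // mesh shaders, sampler feedback
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    return o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1
        && o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2
        && o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1
        && o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
}
```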

The new DirectX 12 Ultimate API is just part of the story though; Microsoft also has new tools to help developers make the most of these new features. This includes the PIX graphics optimisation tool, which enables developers to save as much performance as possible across PC and console. There is also the open-source HLSL compiler, which will aid in the mission to enable next-gen graphics and unlock the performance needed to run these effects properly.

Graphics cards that already support hardware-accelerated raytracing, like the GeForce RTX series, should already be certifiable for DirectX 12 Ultimate support. On the AMD side, future RDNA 2 GPUs, the same architecture powering the Xbox Series X and PlayStation 5, will support the latest version of DirectX and all of its features.

KitGuru Says: This is somewhat expected, as a new generation often calls for enhanced tools and APIs to give developers more to work with. Hopefully over the next few months, we'll start to see the first games showcasing these new graphical features too, both for console and PC.

3DMark updates with Tier 2 Variable-Rate Shading benchmark https://www.kitguru.net/components/graphic-cards/matthew-wilson/3dmark-updates-with-tier-2-variable-rate-shading-benchmark/ Thu, 05 Dec 2019 14:03:19 +0000

DirectX 12 was recently updated with Variable-Rate Shading (VRS), giving developers the ability to improve performance by reducing the level of detail in parts of a frame where you are unlikely to notice an impact on image quality. In August, UL Benchmarks added a VRS-specific test to 3DMark, utilising Tier 1 variable-rate shading. This week, the benchmark is being updated again, adding in support for Tier 2 VRS.

If you aren't caught up on Variable-Rate Shading, here is a quick explanation of how it works:

The shading rate is the number of pixel shader operations called for each pixel. Higher shading rates are taxing on the GPU but create a more detailed image, while lower shading rates improve performance but lack in visual fidelity. Much like variable refresh rate on monitors, variable rate shading allows the rate to be dynamically adjusted based on GPU performance.

There are two tiers of VRS support in DirectX 12. With Tier 1, developers can specify a shading rate for each draw call. With Tier 2, there is more flexibility, allowing different shading rates within each draw call. In the updated 3DMark VRS benchmark, Tier 2 VRS is used to lower shading rates in areas where there is less contrast between neighbouring pixels. For instance, areas masked under shadow will have a lower shading rate compared to parts of the scene that are fully illuminated.
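
In API terms, the two tiers correspond to two calls on ID3D12GraphicsCommandList5. A minimal sketch, assuming a Windows 10 1903+ SDK and hardware reporting the relevant VRS tier:

```cpp
// Hedged sketch: Tier 1 sets one rate per draw; Tier 2 adds a
// screen-space image that varies the rate per tile, as 3DMark's
// test does for low-contrast, shadowed regions.
#include <d3d12.h>

void ApplyVariableRateShading(ID3D12GraphicsCommandList5* cmd,
                              ID3D12Resource* rateImage)
{
    // Tier 1: a single coarse rate for subsequent draws.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_OVERRIDE };  // let the rate image win
    cmd->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);

    // Tier 2: the rate image chooses a rate per screen-space tile.
    cmd->RSSetShadingRateImage(rateImage);
}
```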

In the gallery above, you can see screenshots with Tier 2 VRS on, off and then the third image highlights the areas where different shading rates are applied. Purple highlights are areas with the lowest shading rate, followed by pink, orange, yellow and finally, white.

The Tier 2 VRS 3DMark benchmark is available in 3DMark Advanced Edition and Professional Edition. While the Tier 1 VRS test is compatible with both Nvidia Turing-based GPUs and Intel Ice Lake processors, Tier 2 VRS is limited to Nvidia Turing GPUs. You also need to be running Windows 10 version 1903 or newer.

Discuss on our Facebook page, HERE.

KitGuru Says: Variable Rate Shading is very interesting tech with potential for big performance gains in PC games. Hopefully we'll have a chance to give this new benchmark update a few runs later in the month. 

Microsoft unlocks DirectX 12 support on Windows 7 https://www.kitguru.net/components/graphic-cards/matthew-wilson/microsoft-unlocks-directx-12-support-on-windows-7/ Wed, 13 Mar 2019 12:58:33 +0000

Since the launch of the DirectX 12 API, Microsoft has kept it exclusive to Windows 10. At this point, while there are 800 million Windows 10 devices out in the wild, there is still a good chunk of users running Windows 7. Now, those people will get access to DirectX 12 too.

DirectX 12 support for Windows 7 has arrived, in part thanks to a request from Blizzard. Last year, Blizzard updated World of Warcraft with better multi-threaded CPU performance. The studio wanted to improve things further by allowing Windows 7 users to use DirectX 12, a request that Microsoft decided to grant.


In a post on the matter, Microsoft stated: “when we received this feedback from Blizzard and other developers, we decided to act on it. Microsoft is pleased to announce that we have ported the user mode D3D12 runtime to Windows 7. This unblocks developers who want to take full advantage of the latest improvements in D3D12 while still supporting customers on older operating systems.”

Those on Windows 7 should probably still consider upgrading though, as the decade-old OS is nearing its end of life.

KitGuru Says: Going forward, developers will be able to support DirectX 12 on both Windows 7 and Windows 10. Although with Windows 7 support coming to an end, it is probably a good idea to upgrade anyway. Are any of you still running Windows 7?

The world’s first dedicated DX12/4K benchmark is available starting from today https://www.kitguru.net/components/graphic-cards/matthew-wilson/the-worlds-first-dedicated-dx124k-benchmark-is-available-starting-from-today/ Wed, 11 Oct 2017 10:00:40 +0000

One of the most satisfying parts about building a new system is putting it through its paces and seeing how much more performance you are getting. If you are in need of a new benchmark to spice things up though, then Futuremark has you covered, as 3DMark TimeSpy Extreme is rolling out starting today, allowing you to test your hardware in DirectX 12 and 4K.

TimeSpy Extreme is the world’s first dedicated DirectX 12 4K benchmark. You won’t necessarily need a 4K monitor to run it, but you will need a graphics card with at least 4GB of video memory. This particular benchmark was developed with input from AMD, Intel and Nvidia, alongside members of Futuremark’s Benchmark Development Program. With that in mind, this test should be ideal for all hardware configurations, as well as new high-end GPUs and CPUs with 8 or more cores.

TimeSpy Extreme has had its CPU test redesigned specifically to allow processors with eight or more cores to perform at their full potential. Compared to the original TimeSpy test, the Extreme version is three times more demanding on systems and lets processors use more advanced instruction sets when supported.

3DMark Time Spy Extreme is available now as a free update for 3DMark Advanced Edition and 3DMark Professional Edition licenses purchased after July 14, 2016. For people who bought 3DMark before July 14, 2016, Time Spy Extreme can be added to 3DMark by purchasing the Time Spy upgrade within the application. Standalone versions of 3DMark will prompt you to download and install the update. On Steam, 3DMark is updated automatically.

KitGuru Says: I tend to run through the 3DMark tests every time I put together a new system or tweak my overclocks. I’m sure many of our PC-building audience have similar habits. Are any of you in the process of putting together a new PC? Will you be checking out TimeSpy Extreme when it's available? 

The Vulkan API does support multi-GPU, even on older versions of Windows https://www.kitguru.net/components/graphic-cards/matthew-wilson/the-vulkan-api-does-support-multi-gpu-even-on-older-versions-of-windows/ Thu, 23 Mar 2017 10:15:11 +0000

Over the last week or so, it has been widely reported that the Vulkan API would not support multi-GPU setups on older versions of Windows. This topic came to a head when the studio behind Star Citizen announced plans to drop DirectX 12 in favour of Vulkan due to its multi-OS support. It turns out those reports were inaccurate, as the Khronos Group has come out this week to explain that the Vulkan API will support SLI/Crossfire on Windows 7/8.1 as well as Windows 10 and Linux.

The question of multi GPU support in Vulkan first popped up at GDC earlier this month. Since then, it has been reported that SLI/Crossfire functionality would be tied to Windows 10 but that is not the case. In an updated developer blog, Khronos clarified the situation: “The good news is that the Vulkan multi-GPU specification is very definitely NOT tied to Windows 10. It is possible to implement the Vulkan multi-GPU extension on any desktop OS including Windows 7, 8.X and 10 and Linux.”

There was cause for confusion though. At GDC this year, some Khronos presentations mentioned that Vulkan multi-GPU functionality required the Windows Display Driver Model to be in Linked Display Adapter mode. This led to the assumption that multi-GPU setups wouldn't work with Vulkan on Windows 7 or Windows 8.1, but that appears to not be the case.

The use of WDDM is required for Windows operating systems to take advantage of the Vulkan multi-GPU extension. However, LDA mode is not explicitly required; it can just make the implementation easier. If a developer does decide to use LDA mode, it is not specifically tied to Windows 10 and can be used on other versions of the OS, including Windows 7 and 8. According to the Khronos Group, there are already developers planning to ship games with this level of multi-GPU support included.
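
The mechanism in question is Vulkan's device-group API (core since Vulkan 1.1), which reports linked adapters regardless of the Windows version underneath. A minimal sketch, assuming a Vulkan 1.1 loader:

```cpp
// Hedged sketch: enumerate linked GPU groups. Any group with
// physicalDeviceCount > 1 is an SLI/CrossFire-style linked set that
// can back a single logical VkDevice.
#include <vulkan/vulkan.h>
#include <vector>

std::vector<VkPhysicalDeviceGroupProperties> FindGpuGroups(VkInstance instance)
{
    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, nullptr);

    std::vector<VkPhysicalDeviceGroupProperties> groups(
        count, { VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES });
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups.data());
    return groups;
}
```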

KitGuru Says: After talking about Star Citizen's switch from DX12 to Vulkan, we received a ton of comments complaining about the lack of multi-GPU support on older versions of Windows. Fortunately, it looks like the situation was misconstrued this time around, so those sticking to Windows 7 or Windows 8.1 will still benefit from their SLI or Crossfire setups in Vulkan games. Well, as long as developers choose to use Vulkan's multi GPU extension. 

Star Citizen will drop DX12 for Vulkan to avoid forcing users to upgrade to Windows 10 https://www.kitguru.net/components/graphic-cards/matthew-wilson/star-citizen-will-drop-dx12-for-vulkan-to-avoid-forcing-users-to-upgrade-to-windows-10/ Mon, 20 Mar 2017 11:11:23 +0000

One of the key offerings Microsoft has to tempt gamers into upgrading to Windows 10 is DirectX 12 support. The latest API has been picked up by plenty of games in the last year and while the folks over at Cloud Imperium Games planned to include it in Star Citizen, the engineers there have since changed their minds, dropping the planned DirectX 12 support in favour of Vulkan to ensure cross-OS compatibility.

Right now, Star Citizen runs on DirectX 11 but the plan was to eventually phase that out for DX12. Since then though, Star Citizen's director of graphics engineering, Ali Brown, has decided against this move and will use Vulkan as “it doesn't force users to upgrade to Windows 10”.

There is a bit more to it than that though. While Linux and cross-generational Windows support is a key factor, the fact is, Vulkan has the same feature set and performance advantages as DirectX 12, making it an obvious choice for a developer looking to support as many users as possible. Retaining support for Windows 7 makes a lot of sense too, as the stats over on NetMarketShare show that Windows 7 still holds around 48 percent of the PC market.

Here's the full quote from Ali Brown on the Star Citizen programming forum: “Years ago we stated our intention to support DX12, but since the introduction of Vulkan which has the same feature set and performance advantages this seemed a much more logical rendering API to use as it doesn't force our users to upgrade to Windows 10 and opens the door for a single graphics API that could be used on all Windows 7, 8, 10 & Linux. As a result our current intention is to only support Vulkan and eventually drop support for DX11 as this shouldn't affect any of our backers. DX12 would only be considered if we found it gave us a specific and substantial advantage over Vulkan. The API's really aren't that different though, 95% of the work for these APIs is to change the paradigm of the rendering pipeline, which is the same for both APIs.”

So if DirectX 12 can suddenly provide a notable performance advantage over Vulkan, then the Star Citizen team will revisit the idea of supporting it. For now though, it seems that Vulkan has the advantage due to its cross-platform support.

KitGuru Says: Vulkan has been getting quite a bit of support from developers recently. It will certainly be interesting to see if more developers follow suit and opt to support Vulkan over or in addition to DX12 in the coming years. 

Nvidia’s latest driver is all about boosting DirectX 12 performance https://www.kitguru.net/components/graphic-cards/matthew-wilson/nvidias-latest-driver-is-all-about-boosting-directx-12-performance/ Thu, 09 Mar 2017 20:47:10 +0000

It looks like Nvidia is starting to focus on boosting its DirectX 12 performance, as today's latest ‘Game Ready' driver includes a bunch of refinements intended to significantly boost GeForce performance in DX12 titles. This driver is the ‘Ghost Recon Wildlands' Game Ready driver but it packs in much more than optimisations for Ubisoft's latest open-world adventure.

Nvidia has said that it has invested around 500 engineering-years' worth of work to deliver a solid platform for DirectX 12 games and has already begun arming developers with the tools needed to further optimise performance on Nvidia GPUs going forward.

Specifically, this new driver optimises five DX12 games, bringing substantial gains to each of them:

  • Ashes of the Singularity – 9%
  • Gears of War 4 – 10%
  • Hitman – 23%
  • Rise of the Tomb Raider – 33%
  • Tom Clancy's The Division – 4%

These gains were achieved by refining the code in the driver and working with each game developer to increase performance by an average of 16 percent across DirectX 12 games.

Finally, this driver contains day-one support for Ghost Recon Wildlands, which also includes support for Ansel, Nvidia's 360-degree screenshot tool that can capture game stills at insanely high resolutions, well above 4K and even 8K.

KitGuru Says: It is good to see Nvidia putting some extra focus on improving DirectX 12 performance, especially since most PC games these days are shipping with it by default. 

Deus Ex: Mankind Divided won’t have DX12 at launch https://www.kitguru.net/gaming/matthew-wilson/deus-ex-mankind-divided-wont-have-dx12-at-launch/ Thu, 18 Aug 2016 16:29:06 +0000

We have been hearing a lot about the PC-specific features of Deus Ex: Mankind Divided since its announcement last year. One of the big announcements for the game is that it would have DirectX 12 support. Unfortunately, while we will be getting the DX11 version of the game next week, the DX12 version needs some more time and won't be available on the 23rd.

In an update on the game's Steam page, the Mankind Divided team explained that while DX12 was previously touted to be in the game at launch, this will no longer be the case: “Contrary to our previous announcement, Deus Ex: Mankind Divided, which is shipping on August 23rd, will unfortunately not support DirectX 12 at launch. We have some extra work and optimizations to do for DX12, and we need more time to ensure we deliver a compelling experience”.


Fortunately, we won't have long to wait for DirectX 12 support to arrive, as it will be coming in a patch on the 5th of September: “Our teams are working hard to complete the final push required here though, and we expect to release DX12 support on the week of September 5th!”


Aside from that, Deus Ex: Mankind Divided also got a new trailer today, giving us one final look at the game prior to its launch next Tuesday.

KitGuru Says: For the most part, DX12 games like Rise of the Tomb Raider and Hitman haven't offered massive performance gains over their respective DX11 counterparts so far in my own experience. With that in mind, I'm not too disappointed by this slight delay, though I do wonder how many people were really looking forward to having DirectX 12 on day one. Do you guys tend to opt for DX12 nowadays whenever it's available? 

AMD and Firaxis team up for DirectX 12 renderer in Civ VI https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-and-firaxis-team-up-for-directx-12-renderer-in-civ-vi/ Wed, 13 Jul 2016 17:15:37 +0000

AMD and Firaxis Studios have decided to team up for Sid Meier's Civilization VI, creating a new DirectX 12 renderer specifically optimised to bring an in-game performance boost to Radeon graphics cards. This means that Civ VI will support technologies like Asynchronous Compute and Explicit Multi-Adapter.

Right now, AMD's Graphics Core Next and Polaris architectures seem to have the best support for Asynchronous Compute, so GPUs like the RX 480 may see a nice performance bump. This technology allows GPU-related tasks to be completed in parallel, rather than one at a time, which can lead to a significant performance advantage.
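
In D3D12 terms, async compute boils down to creating a second command queue of compute type and submitting work there while the direct (graphics) queue stays busy; hardware like GCN can then schedule the two concurrently. A minimal sketch:

```cpp
// Hedged sketch: create a compute-type queue alongside the usual
// direct queue. Work submitted here can overlap graphics work on
// hardware that schedules both queue types concurrently.
#include <d3d12.h>

ID3D12CommandQueue* CreateAsyncComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only queue

    ID3D12CommandQueue* queue = nullptr;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}
```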


Explicit Multi-Adapter support, on the other hand, allows developers to spread the workload across multiple GPUs using techniques like Split-Frame Rendering, which initially debuted as part of the Mantle API in Civilization: Beyond Earth. This approach breaks every frame down into tiles and assigns each one to a GPU; the cards then work together to assemble the tiles into a single frame.
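
The explicit part of Explicit Multi-Adapter is that the application, not the driver, owns every GPU: step one is simply creating an independent D3D12 device per DXGI adapter, and splitting each frame into tiles across them is then the engine's job. A minimal sketch of that first step:

```cpp
// Hedged sketch: one D3D12 device per enumerated DXGI adapter, the
// starting point for explicit multi-adapter / split-frame rendering.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```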

Civilization VI comes out on the 21st of October this year as a PC exclusive. Aside from that, there will be four DLCs releasing later on to help expand the base game.

KitGuru Says: Given this news, it will be interesting to see how well Civilization VI runs across a range of cards when it comes out later this year. Are any of you looking forward to Civ VI? What do you think of the new partnership with AMD? 

Creative Assembly and AMD team up for Total War: Warhammer DX12 https://www.kitguru.net/gaming/matthew-wilson/creative-assembly-and-amd-team-up-for-total-war-warhammer-dx12/ Mon, 23 May 2016 14:50:21 +0000

AMD has been working closely with several developers in the build-up to the launch of DirectX 12 and its supported games. This time around, AMD has partnered with Creative Assembly for the DirectX 12 version of Total War: Warhammer, which will have support for things like Async Compute and optimisations for explicit multi-GPU.

Total War: Warhammer will be the first game in a trilogy focussing on the Warhammer universe. The game officially comes out tomorrow but DirectX 12 support will be coming in June via a patch.


While developing Total War: Warhammer, more time was dedicated to engine optimization than in any other Total War game. The ultimate aim of this work is to utilize your PC’s resources more effectively, enabling improved framerates across the spectrum of PC configurations.

Being a DirectX 12 title, Total War: Warhammer should perform quite well on the Graphics Core Next architecture. Keep an eye out for our own performance analysis soon.

KitGuru Says: AMD has had quite the lead when it comes to Async Compute so it will be interesting to see how the game performs across both Radeon and Nvidia GPUs. Are any of you planning on picking up Total War: Warhammer? 

Hitman DX12 support is broken for some but there’s a temp fix https://www.kitguru.net/components/graphic-cards/matthew-wilson/hitman-dx12-support-is-broken-for-some-but-theres-a-temp-fix/ Sat, 30 Apr 2016 13:06:40 +0000

Hitman was one of the first triple-A titles to launch with DirectX 12 right off the bat but unfortunately, it hasn't been an easy road. I already faced some issues with the DirectX 12 version of Hitman while putting together our performance analysis last month, and now it seems the latest update for the game has stopped the DX12 launcher from working at all for quite a few people.

Hitman's second episode launched this week and at first, some people were crashing on the intro screen, so IO Interactive put out a quick Hotfix through Steam. Unfortunately, this seems to have stopped the DX12 launcher working for some people. Attempting to load the game in DirectX 12 mode shows the game running in Steam and Task Manager but the window itself doesn't appear.


However, DirectX 11 mode still works, so there is still the option to play for most people. There is a temporary workaround for those wanting to launch the game in DX12 mode on Steam: click on the game in your library, go to Properties, then click ‘Set Launch Options' and input the following command: ” -SKIP_LAUNCHER”, without the quotation marks. This should help.

KitGuru Says: Hitman seemed like a fun game when I first tried it out though the PC performance could be better. I found it to be a tad unstable at times. Have any of you jumped back into Hitman for Episode 2? How's it running?

Watch Dogs 2 to feature DX12 and AMD optimization https://www.kitguru.net/gaming/matthew-wilson/watch-dogs-2-to-feature-dx12-and-amd-optimization/ Sat, 02 Apr 2016 12:42:19 +0000

While most Ubisoft titles over the last couple of years have been Nvidia-focussed, it looks like the sequel to Watch Dogs will see the focus shift over to AMD optimization. We may also see better performance on the PC side in general, as the game will be running on the new DirectX 12 API.

The news comes from AMD's Roy Taylor over on Twitter, where he tweeted out an image showing Watch Dogs 2 next to the DirectX 12 logo, along with the tagline “Watch Dogs 2 features DirectX 12 support; Will be highly optimized for AMD GPUs”.


This also probably means that Watch Dogs 2 will have a big presence at E3 this year as Ubisoft hasn't really shown anything off about the game yet. However, the game is apparently set to launch before the end of March next year.

The original Watch Dogs received mixed reviews and wasn't as impressive on the PC as many had hoped it would be. However, the game did go on to sell nine million copies and Ubisoft does appear to be making a bit more of an effort on the PC side of things so perhaps things will improve in the sequel.

KitGuru Says: I enjoyed most of what I played of Watch Dogs but I didn't really ever have the urge to beat it. Perhaps Ubisoft will spice things up a bit in the sequel. Did any of you play the original Watch Dogs? Will you give the sequel a shot? 

Hitman PC game analysis https://www.kitguru.net/gaming/matthew-wilson/hitman-pc-game-analysis/ Mon, 28 Mar 2016 12:46:57 +0000

Hitman has been one of Square Enix's staple franchises over the years but this time around, they are handling Agent 47 a little differently.

Rather than launching a full game, we are getting ‘episodes' which are due to be released on a monthly basis.

However, this is also one of the first triple-A titles to launch with DirectX 12 support, which is quite exciting, so let's dive in and see just how well it runs.


Our first port of call is the graphics options menu, which isn't quite as comprehensive as the one you will find in The Division, but most of the bases are covered, with options for FXAA or SMAA, texture quality, SSAO and shadow resolution, and you can even switch between the DirectX 11 and DirectX 12 APIs.


The game itself is fairly impressive as far as visuals go. There are some muddy textures here and there but that is to be expected of a third-person game, since everything is designed for you to look at from a distance.


Where this game really shines is in level design; even the tutorial levels are well thought out, with plenty of crowds and assassination options. These crowds will have an effect on frame rate though, so that is worth remembering.


One area where the game is let down is hair quality; it just doesn't look up to par with the rest of the game.

I have managed to play through all of the levels available in the first episode of Hitman and I haven't encountered any crashes or graphical glitches like shadow flicker, but I have run into issues with cut scenes freezing or not playing at all.

Today, I will be benchmarking Hitman on a system featuring an Intel Core i7 6700K, 16GB of G.Skill DDR4 RAM, a 1TB Samsung EVO SSD and an Asus Maximus VIII Hero motherboard. For graphics cards, I will be using a reference GTX 980Ti, an MSI GTX 970 4G, a Sapphire R9 290 Vapor-X, an XFX R9 390X Ghost Edition and finally, the newest addition to the collection, an R9 Fury X. None of the cards are overclocked in our tests.


For the release of this game both AMD and Nvidia launched updated drivers. On the Nvidia side, we are running driver version 364.51 and for AMD, we are using Crimson Software version 16.3. Our results were collected using the in-game benchmarking tool but I will also be discussing the real-world gameplay experience in the text below.

(Chart: Hitman 1080p benchmark results)

(Chart: Hitman 1440p benchmark results)

DirectX 11 performance is fairly solid across the board. While the GTX 980 Ti remains king in our 1080p tests, the Fury X wasn't too far behind and closed the gap at 1440p. The R9 390X and R9 290 also pull in excellent performance numbers, pushing the GTX 970 right down to the bottom of our charts.

I would disregard the minimum frame rates reported by the benchmark tool. As you can see, all of the cards were brought down to 12 or 13 frames per second, with some falling into the single digits; it definitely seems like a bug with the benchmark itself, and it is not the only one I came across.

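When a built-in benchmark's minimum looks buggy like this, a more trustworthy approach is to capture per-frame times yourself and report a percentile figure, the so-called 1% low, instead of the single worst frame. A short C++ sketch of that calculation (our own illustration):

#include <algorithm>
#include <vector>

// Turn captured frame times (in milliseconds) into a "1% low" fps figure,
// which is far less sensitive to one-off spikes than a raw minimum.
double OnePercentLowFps(std::vector<double> frameTimesMs)
{
    if (frameTimesMs.empty()) return 0.0;
    std::sort(frameTimesMs.begin(), frameTimesMs.end());  // fastest first
    size_t idx = (frameTimesMs.size() * 99) / 100;        // 99th-percentile frame time
    return 1000.0 / frameTimesMs[idx];                    // frame time -> fps
}
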
You may have noticed at this point that there is no graph for DirectX 12 performance. Believe me, I'm just as disappointed as you are; unfortunately, I ran into complications with the DirectX 12 version of the game. For starters, it was capped at 60 frames per second on some of our GPUs and uncapped on others, despite identical settings.

On top of that, I faced some freezing and crashing issues while attempting to benchmark DirectX 12 with Nvidia GPUs. However, the experience was a bit smoother on the AMD side of things.

While the benchmark's minimum frame rates may be exaggerated, I can tell you that this game does suffer from random frame rate dips into the 40s and 30s in both DirectX 11 and 12. This tends to happen in more crowded zones and it affects both Nvidia and AMD GPUs, though I will admit that AMD's cards seemed to handle crowded areas better overall.

This could be down to the fact that this is an AMD ‘Gaming Evolved' title, but it was an interesting observation nonetheless.

Performance issues aside, Hitman is a fairly enjoyable experience. The dips don't last long enough to truly ruin the gameplay, which remains fun throughout the entire first episode.

The gameplay itself is as fun as ever. Levels are designed like sandboxes, filled with multiple ways to reach your target and take them out. There are also challenges tracking how many ways you complete a mission, which adds some replay value.

That said, if you are in it for the story then the whole experience is quite short. I was done with the first episode, which contains the tutorials and the first proper level of the game, in around three hours. By the time you start getting into the really good stuff the game ends, which is a shame.

There will be more content released on a monthly basis throughout the year, so there is plenty more to come. What you can play right now is really fun, but I am not entirely convinced that an episodic format works for a game like Hitman.

KitGuru Says: While Hitman does have its share of performance issues, the gameplay is solid; it is just a shame that there isn't more of it for the time being.

The post Hitman PC game analysis first appeared on KitGuru.]]>
https://www.kitguru.net/gaming/matthew-wilson/hitman-pc-game-analysis/feed/ 6
Hitman will run on DirectX 12 at launch https://www.kitguru.net/gaming/matthew-wilson/hitman-will-run-on-directx-12-at-launch/ https://www.kitguru.net/gaming/matthew-wilson/hitman-will-run-on-directx-12-at-launch/#comments Sat, 05 Mar 2016 17:56:13 +0000 http://www.kitguru.net/?p=286126 The PC Gamer Weekender event is taking place right now and some interesting news has managed to slip out of it. According to the demo of Hitman being shown at the event, the game will support DirectX 12 fully at launch, a slide also lists the benefits of going with the latest API. Thanks to …

The post Hitman will run on DirectX 12 at launch first appeared on KitGuru.]]>
The PC Gamer Weekender event is taking place right now and some interesting news has managed to slip out of it. According to the demo of Hitman being shown at the event, the game will fully support DirectX 12 at launch, and a slide also lists the benefits of going with the latest API.

Thanks to DirectX 12, you can probably expect better performance, courtesy of improved multithreading support and reduced overhead in CPU-bound scenarios.

[Screenshot: Hitman gameplay]

PC Gamer's slide image is a bit blurry but it confirms that DirectX 12 will bring:

  • better multithreading (illustrated in the C++ sketch after this list)
  • increased performance where CPU bound
  • better experiences for laptop gamers
  • asynchronous compute on AMD cards for significant performance gains
  • early days, but we will continue expanding and improving on the DX12 experience in future releases

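To illustrate the multithreading point: under DirectX 12, each CPU thread can record its own command list and the results are submitted together on a single queue, something DirectX 11's single immediate context never allowed. A rough C++ sketch of the pattern (our own illustration, not code from the game):

#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                      int threadCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(threadCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(threadCount);
    std::vector<std::thread> workers;

    for (int i = 0; i < threadCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Each worker records draw calls into its own list, no global lock.
        workers.emplace_back([&, i] {
            // ... record this thread's slice of the scene here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One batched submission keeps driver overhead low.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
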
The last bullet point likely means that other Square Enix-owned studios will start using DirectX 12 for future games as well.

KitGuru Says: While we have known about DirectX 12 and its benefits for some time now, it is great to finally see games starting to support it. Are any of you planning on picking up Hitman next month?

The post Hitman will run on DirectX 12 at launch first appeared on KitGuru.]]>
https://www.kitguru.net/gaming/matthew-wilson/hitman-will-run-on-directx-12-at-launch/feed/ 7
AMD launches Radeon Software Crimson 16.2 update https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-launches-radeon-software-crimson-16-2-update/ https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-launches-radeon-software-crimson-16-2-update/#respond Thu, 25 Feb 2016 20:38:06 +0000 http://www.kitguru.net/?p=285395 Ever since launching Radeon Software late last year, AMD has recommitted itself to driver support with big monthly updates. Today, AMD released Radeon Software Crimson Edition 16.2, which brings some performance boosts to titles like Rise of the Tomb Raider and Ashes of Singularity, which is currently an excellent benchmarking tool for DirectX 12. The …

The post AMD launches Radeon Software Crimson 16.2 update first appeared on KitGuru.]]>
Ever since launching Radeon Software late last year, AMD has recommitted itself to driver support with big monthly updates. Today, AMD released Radeon Software Crimson Edition 16.2, which brings performance boosts to titles like Rise of the Tomb Raider and Ashes of the Singularity, the latter currently serving as an excellent DirectX 12 benchmarking tool.

The first major highlight of this driver is optimization for the Ashes of the Singularity 2.0 benchmark, which evaluates DirectX 12 performance. The update is also better optimized for virtual reality content, allowing the R9 390, R9 Nano and Fury graphics cards to score as high as they should on the SteamVR Performance Test tool that launched this week.


Those of you still playing Rise of the Tomb Raider on the PC should now see some performance and stability improvements, while those playing XCOM 2 or planning on grabbing The Division now have CrossFire profiles for multi-GPU setups.

Aside from that, the changelog lists a bunch of fixes for things like flickering and stutter in a range of games, including Fallout 4, Rise of the Tomb Raider and World of Warcraft. However, there are still some known issues: those using an AMD Radeon 59xx or 79xx dual-GPU card may find themselves unable to activate CrossFire at all, core clock speeds on some graphics cards may not be consistent while playing games, and Rise of the Tomb Raider may randomly crash with Tessellation enabled.

You can find the full list of details HERE.

KitGuru Says: If you're running a Radeon graphics card right now, then let us know how you get on with the new driver. 

The post AMD launches Radeon Software Crimson 16.2 update first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-launches-radeon-software-crimson-16-2-update/feed/ 0
Gears of War: Ultimate Edition PC system requirements released https://www.kitguru.net/desktop-pc/matthew-wilson/gears-of-war-ultimate-edition-pc-system-requirements-released/ https://www.kitguru.net/desktop-pc/matthew-wilson/gears-of-war-ultimate-edition-pc-system-requirements-released/#comments Tue, 23 Feb 2016 21:58:06 +0000 http://www.kitguru.net/?p=285202 We have known that Gears of War: Ultimate Edition is coming to the PC for a while, with full DirectX 12 support as well as a big focus on 4K. It looks like the game may finally be on the way as the system requirements have finally been released, including minimum specifications, recommended specifications and …

The post Gears of War: Ultimate Edition PC system requirements released first appeared on KitGuru.]]>
We have known that Gears of War: Ultimate Edition is coming to the PC for a while, with full DirectX 12 support as well as a big focus on 4K. It looks like the game may finally be on the way, as the system requirements have now been released, covering minimum specifications, recommended specifications and a 4K specification for those on the cutting edge of PC gaming.

Minimum requirements include an Intel Core i5 or AMD FX-6350, 8GB of RAM and a GTX 650 Ti or an AMD R7 260X graphics card.

[Image: Gears of War: Ultimate Edition]

The recommended specification includes an Intel Core i5 or an AMD FX-8350, 16GB of RAM and an Nvidia GTX 970 or an AMD R9 290X graphics card. Those wanting to play at 4K will obviously need a much higher-spec rig, with requirements including an Intel Core i7-4790K or an eight-core AMD FX CPU, 16GB of RAM and an Nvidia GTX 980Ti or Radeon R9 390X.

We still don't have a release date but by the looks of it, the game is set to take full advantage of the PC platform. It is worth noting that this game will only be available on Windows 10, which likely also means that it won't be on Steam.

KitGuru Says: I'm really looking forward to Gears of War returning to the PC but it would be nice to get a release date at some point. Are any of you into Gears of War at all? 

The post Gears of War: Ultimate Edition PC system requirements released first appeared on KitGuru.]]>
https://www.kitguru.net/desktop-pc/matthew-wilson/gears-of-war-ultimate-edition-pc-system-requirements-released/feed/ 6
AMD is bundling Hitman with its graphics cards https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-is-bundling-hitman-with-its-graphics-cards/ https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-is-bundling-hitman-with-its-graphics-cards/#comments Tue, 16 Feb 2016 16:28:51 +0000 http://www.kitguru.net/?p=284488 While Nvidia is apparently set to offer up The Division to its GPU buyers, AMD has a deal of its own up its sleeve. Starting from today, qualifying purchases of some of AMD's graphics cards or even FX-CPUs will come bundled with the latest Hitman game from IO Interactive, which will be released episodically throughout …

The post AMD is bundling Hitman with its graphics cards first appeared on KitGuru.]]>
While Nvidia is apparently set to offer up The Division to its GPU buyers, AMD has a deal of its own up its sleeve. Starting from today, qualifying purchases of some of AMD's graphics cards or even FX-CPUs will come bundled with the latest Hitman game from IO Interactive, which will be released episodically throughout the rest of the year.

Those who pick up an AMD R9 390 or R9 390X will be eligible for a copy of Hitman, as will those who buy an AMD FX eight-core or six-core processor at participating retailers like Overclockers UK.


The code you get for Hitman will grant you access to all of the episodes released in the next year. You will also be able to take part in the beta for the game, which runs this weekend from the 19th of February until the 22nd.

Hitman will use DirectX 12's asynchronous compute support, which taps dedicated hardware on AMD cards known as asynchronous compute engines to handle heavy workloads in parallel, and should help unlock greater performance.

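Mechanically, asynchronous compute in DirectX 12 simply means creating a second command queue of type COMPUTE next to the graphics queue; on GCN cards the asynchronous compute engines can then execute its work while the graphics queue is busy. A minimal C++ sketch of the setup (our own illustration, not Hitman's actual code):

#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& gfx,
                  ComPtr<ID3D12CommandQueue>& compute)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;   // graphics (and compute/copy)
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&gfx));

    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;  // compute-only queue
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&compute));
    // Compute work submitted here (post-processing, lighting and so on)
    // can overlap rendering on the graphics queue, synchronised by fences.
}
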
KitGuru Says: Hitman is shaping up to be an interesting title indeed. Are any of you planning on picking up an AMD GPU soon? 

The post AMD is bundling Hitman with its graphics cards first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/matthew-wilson/amd-is-bundling-hitman-with-its-graphics-cards/feed/ 5
AMD: next-gen consoles will have big focus on virtual reality https://www.kitguru.net/components/graphic-cards/jon-martindale/amd-next-gen-consoles-will-have-big-focus-on-virtual-reality/ https://www.kitguru.net/components/graphic-cards/jon-martindale/amd-next-gen-consoles-will-have-big-focus-on-virtual-reality/#comments Fri, 13 Nov 2015 11:07:10 +0000 http://www.kitguru.net/?p=275027 AMD is putting a lot of its eggs in the virtual reality basket, with a big push lately for its LiquidVR platform, an announcement of partnerships with system makers for Oculus Ready PCs and that's perhaps because it knows something we don't: that the next-generation of games consoles will have a massive VR focus. It's also …

The post AMD: next-gen consoles will have big focus on virtual reality first appeared on KitGuru.]]>
AMD is putting a lot of its eggs in the virtual reality basket, with a big push lately for its LiquidVR platform and an announcement of partnerships with system makers for Oculus Ready PCs. That is perhaps because it knows something we don't: that the next generation of games consoles will have a massive VR focus. It has also said that to make that possible, they will have as much as five times the performance per watt of the Xbox One and PS4.

The latest generation of consoles is plenty powerful, but as we've seen from the very early days of both the new Xbox and PlayStation systems, they struggle to output much over 30 frames per second at 1080p. Considering most commercial VR headsets will be of a higher resolution than that when they release next year, and will need to run at upwards of 90 frames per second to be comfortable, there's a clear issue there.

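Some rough numbers make the gap concrete (our arithmetic, taking the Oculus Rift's announced 2160×1200 panel as the example):

1920 × 1080 × 30fps ≈ 62 million pixels per second (typical console output today)
2160 × 1200 × 90fps ≈ 233 million pixels per second (Rift-class VR)

That is roughly 3.7 times the pixel throughput, before accounting for VR's stricter latency requirements.
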
We're told that won't be the case with the next generation of consoles, as AMD believes performance will take a massive leap over what is currently offered. While five times the performance per watt doesn't necessarily equate to five times the performance we have available now, it should represent a significant increase.

[Image: AMD LiquidVR]

Whether even that will be enough to handle richer game engines, new higher-resolution screens (likely at least 4K by that point) and the high frame rates required by virtual reality remains to be seen. Indeed, AMD claims it is the same Graphics Core Next architecture in use since 2011 that will be able to achieve the cited performance gains.

Not everyone is likely to have so much confidence in its abilities.

AMD, however, seems confident, stating (via WCCFTech) that not only will the new generation be plenty powerful for the job, but that its APU hardware will be behind it all. Having supplied all of the graphics hardware for the current-gen systems, it plans to do the same for the next generation too, claiming that it will deliver the respective chips to console makers by 2018.

[Video: http://www.youtube.com/watch?v=cx370pvDTrw]

Of course, even with improved hardware, many of the improvements required for seamless VR will come from optimisation. That is where LiquidVR comes in: it can utilise asynchronous shaders under DirectX 12 for improved performance, as well as offering much more linear scaling across multiple GPU configurations.

Perhaps that means we'll be looking at multiple GPU/APU next-gen consoles.

Discuss on our Facebook page, HERE.

KitGuru Says: Do you believe AMD's GCN architecture is capable of scaling up as much as AMD hopes? It seems likely that the next-gen will have a big focus on virtual reality, though that may depend on how well it is accepted in gaming over the next few years. It could be that Microsoft's Augmented Reality push sees it try and incorporate that technology in the next-gen Xbox instead.

The post AMD: next-gen consoles will have big focus on virtual reality first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/jon-martindale/amd-next-gen-consoles-will-have-big-focus-on-virtual-reality/feed/ 5
GeForce and Radeon SLI support tested in Ashes of Singularity https://www.kitguru.net/components/graphic-cards/matthew-wilson/geforce-and-radeon-sli-support-tested-in-ashes-of-singularity/ https://www.kitguru.net/components/graphic-cards/matthew-wilson/geforce-and-radeon-sli-support-tested-in-ashes-of-singularity/#comments Mon, 26 Oct 2015 23:00:14 +0000 http://www.kitguru.net/?p=273124 Over the last year, we have been hearing a lot about DirectX 12 and now we are finally starting to see just what the new API can bring to the table. So far, DirectX 12 performance has been measured with Ashes of Singularity on both AMD and Nvidia GPUs, but what happens when you put …

The post GeForce and Radeon SLI support tested in Ashes of Singularity first appeared on KitGuru.]]>
Over the last year, we have been hearing a lot about DirectX 12 and now we are finally starting to see just what the new API can bring to the table. So far, DirectX 12 performance has been measured with Ashes of the Singularity on both AMD and Nvidia GPUs, but what happens when you put the two together? In a recent update to the early access title, multi-GPU support was added, allowing you to pair an AMD GPU with an Nvidia one, or vice versa.


For those who don't know, Ashes of the Singularity is an Early Access RTS title on Steam built around DirectX 12. The latest build enables Explicit Multi-Adapter, one of the API's two multi-GPU modes. It allows inherently different graphics cards to work together efficiently, meaning different architectures, integrated GPUs and cards from entirely different makers can combine to pump out frames.

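On the engine side, Explicit Multi-Adapter starts with the application enumerating every DXGI adapter itself and creating an independent D3D12 device on each; dividing the rendering work between them is then entirely the engine's job. A simplified C++ sketch (our own illustration, not the game's actual code):

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<ID3D12Device> device;
        // Any vendor's adapter qualifies, as long as it supports FL 11_0;
        // that is what lets GeForce and Radeon cards render together.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
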
AnandTech was the first to test Explicit Multi-Adapter, pairing up a GTX 980Ti, a Titan X, an R9 Fury X and an R9 Fury, and mixing them in different configurations:

[Benchmark charts: mixed-GPU results. Click images to enlarge. Source: Anandtech.]

The site even worked out the performance gain percentages:

[Chart: multi-GPU performance gains. Source: Anandtech.]

Explicit Multi-Adapter support is a new addition to DirectX 12 and so far, it seems to be working fairly well. The full report is well worth a look and goes into much more depth than we can in this news story. This is also just the first application of the technology; hopefully things will only get better with time.

KitGuru Says: This level of multi-GPU support is exciting to see fully up and running, and it holds up quite well judging by the numbers. It will be interesting to see if Nvidia and AMD begin to actively support this technology going forward.

The post GeForce and Radeon SLI support tested in Ashes of Singularity first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/matthew-wilson/geforce-and-radeon-sli-support-tested-in-ashes-of-singularity/feed/ 8
Ark DirectX 12 update delayed indefinitely, needs GPU drivers to mature https://www.kitguru.net/components/graphic-cards/matthew-wilson/ark-directx-12-update-delayed-indefinitely-needs-gpu-drivers-to-mature/ https://www.kitguru.net/components/graphic-cards/matthew-wilson/ark-directx-12-update-delayed-indefinitely-needs-gpu-drivers-to-mature/#comments Fri, 09 Oct 2015 18:00:10 +0000 http://www.kitguru.net/?p=271402 Ark: Survival Evolved was set to be one of the first popular titles to support DirectX 12 this year following the release of Windows 10 back in July. The highly anticipated DirectX 12 update was originally set to release back in August, before being delayed in order to wait for drivers to mature. Now, it …

The post Ark DirectX 12 update delayed indefinitely, needs GPU drivers to mature first appeared on KitGuru.]]>
Ark: Survival Evolved was set to be one of the first popular titles to support DirectX 12 this year following the release of Windows 10 back in July. The highly anticipated DirectX 12 update was originally set to release back in August, before being delayed in order to wait for drivers to mature. Now, it looks like players will be kept waiting even longer, as Studio Wildcard no longer has an ETA for the update launch.

To be fair, Ark is still in early access, so development is ongoing. Right now, the studio has a DX12 build of the game running, but it doesn't want to roll the update out until drivers have matured to the point where performance surpasses the DirectX 11 version.

[Screenshot: Ark: Survival Evolved]

In a new statement over on the game's Steam forum, one of the Ark developers said: “After we found out it was heavily dependent on drivers and the hardware vendors progress we slowed ours a bit. It's still being worked on, but we have no plans on releasing it until it at least runs betters than the DX11 version. And that is a hard thing to generate an ETA for.”

So if you happen to be playing Ark, and you're wondering what's taking the DirectX 12 update so long, you can blame driver support.

Discuss on our Facebook page, HERE.

KitGuru Says: People are very excited for Ark's DirectX 12 update. The game just sold two million copies and the player base is strong, but it doesn't run all that well. The DX12 update could help with that but unfortunately, things keep getting pushed back.

The post Ark DirectX 12 update delayed indefinitely, needs GPU drivers to mature first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/matthew-wilson/ark-directx-12-update-delayed-indefinitely-needs-gpu-drivers-to-mature/feed/ 8
Nvidia’s next-gen ‘Pascal’ graphics cards will get 16GB of HBM2 memory https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidias-next-gen-high-end-graphics-cards-will-get-16gb-of-hbm2-memory/ https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidias-next-gen-high-end-graphics-cards-will-get-16gb-of-hbm2-memory/#comments Thu, 24 Sep 2015 01:57:48 +0000 http://www.kitguru.net/?p=269090 At the GPU Technology Conference in Japan, Nvidia Corp. once again revealed key features of its next-generation graphics processing architecture code-named “Pascal”. As a it appears, the company has slightly changed its plans concerning memory capacity supported by its upcoming GPUs. As expected, Nvidia’s high-end graphics processor that belongs to the “Pascal” family will feature an …

The post Nvidia’s next-gen ‘Pascal’ graphics cards will get 16GB of HBM2 memory first appeared on KitGuru.]]>
At the GPU Technology Conference in Japan, Nvidia Corp. once again revealed key features of its next-generation graphics processing architecture code-named “Pascal”. As it appears, the company has slightly changed its plans concerning the memory capacity supported by its upcoming GPUs.

As expected, Nvidia’s high-end graphics processor belonging to the “Pascal” family will feature an all-new architecture with a number of exclusive innovations, including mixed precision (for the first time, Nvidia’s stream processors will support FP16 alongside FP32 and FP64), NVLink interconnect technology for supercomputers and multi-GPU configurations, unified memory addressing, as well as support for second-generation high-bandwidth memory (HBM2).

Based on a slide that Nvidia demonstrated at GTC Japan 2015, next-generation high-end graphics cards with “Pascal” GPUs will sport up to 16GB of HBM2 with up to 1TB/s of bandwidth (a 4096-bit interface running at HBM2’s 2Gbps per pin works out to exactly that). Previously, Nvidia expected select solutions with “Pascal” graphics processors to feature up to 32GB of HBM2.

[Slide: Nvidia “Pascal” highlights. Image by WccfTech]

Given that Nvidia does not produce high-bandwidth memory itself, but relies on supplies from companies like Samsung Electronics and SK Hynix, changes to their roadmaps can affect Nvidia’s plans. To install 32GB of HBM2 on a GPU with a 4096-bit memory bus, 8GB memory stacks are required. While the HBM2 specification allows such ICs [integrated circuits] to be built, it is not easy to manufacture packages with eight vertically stacked 8Gb memory dies. As a result, such chips may be delayed from 2016 to a later date.

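The arithmetic behind that (our working, from the figures above; each HBM2 stack has a 1024-bit interface):

4096-bit bus ÷ 1024 bits per stack = 4 stacks on the package
32GB ÷ 4 stacks = 8GB per stack = eight 8Gb dies stacked vertically (hard to build)
16GB ÷ 4 stacks = 4GB per stack = four 8Gb dies per stack (far more practical for 2016)
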
16GB of HBM2 memory should be enough for gaming and professional graphics cards, but high-performance computing applications could take advantage of 32GB of onboard memory even now.

Discuss on our Facebook page, HERE.

KitGuru Says: If Nvidia is not able to get 8GB HBM2 stacks next year, AMD will not be able to get them either. Therefore, expect graphics cards with up to 16GB of high-bandwidth memory from both companies.

The post Nvidia’s next-gen ‘Pascal’ graphics cards will get 16GB of HBM2 memory first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidias-next-gen-high-end-graphics-cards-will-get-16gb-of-hbm2-memory/feed/ 44
Valve: DirectX 12 does not make a lot of sense, Vulkan does https://www.kitguru.net/components/graphic-cards/anton-shilov/valve-directx-12-does-not-make-a-lot-of-sense-vulkan-does/ https://www.kitguru.net/components/graphic-cards/anton-shilov/valve-directx-12-does-not-make-a-lot-of-sense-vulkan-does/#comments Thu, 24 Sep 2015 00:02:35 +0000 http://www.kitguru.net/?p=269081 Microsoft Corp.’s DirectX 12 application programming interface promises to significantly improve performance of video games in Windows 10 operating system thanks to efficient usage of modern hardware. However, Valve Software believes that it makes no sense to use DX12, but to utilize cross-platform Vulkan API instead. Modern application programming interfaces – namely Apple’s Metal, Khronos Group’s …

The post Valve: DirectX 12 does not make a lot of sense, Vulkan does first appeared on KitGuru.]]>
Microsoft Corp.’s DirectX 12 application programming interface promises to significantly improve the performance of video games on Windows 10 thanks to more efficient usage of modern hardware. However, Valve Software believes it makes little sense to build a DX12 back end, and that developers should use the cross-platform Vulkan API instead.

Modern application programming interfaces, namely Apple’s Metal, Khronos Group’s Vulkan and Microsoft Corp.’s DirectX 12, are all considered low-level APIs and have generally similar capabilities. All three make better use of multi-core microprocessors, give software makers close-to-metal access to GPU resources, support GPGPU [general-purpose computing on graphics processing units] and reduce driver overhead, and all are compatible with a wide range of hardware.

[Slide: Valve on DirectX 12 vs Vulkan]

Apple’s and Microsoft’s APIs are only supported by Apple’s iOS/OS X platforms and Microsoft’s Windows 10, respectively. By contrast, Vulkan should be compatible with operating systems from Google (future versions of Android) and Microsoft (Windows 7/8/10) as well as a wide range of hardware, which makes it preferable for game developers who want their titles to run on many types of devices.

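The portability argument is easiest to see in code: the same Vulkan bootstrap compiles unchanged on Windows 7/8/10, Linux/SteamOS and, eventually, Android. A minimal C++ sketch, assuming the Vulkan 1.0 interface as it was later finalized:

#include <vulkan/vulkan.h>

// Create a Vulkan instance; no platform-specific calls are needed until
// a window surface is attached, which is why the code ports so cleanly.
VkInstance CreateInstance()
{
    VkApplicationInfo app = {};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.pApplicationName = "portable-renderer";  // hypothetical app name
    app.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo info = {};
    info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    info.pApplicationInfo = &app;

    VkInstance instance = VK_NULL_HANDLE;
    vkCreateInstance(&info, nullptr, &instance);  // check the VkResult in real code
    return instance;
}
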
“Unless you are aggressive enough to be shipping a DX12 game this year, I would argue that there is really not much reason to ever create a DX12 back end for your game,” said Dan Ginsburg, a software developer from Valve, at Siggraph, reports DSO Gaming. “The reason for that is that Vulkan will cover you on Windows 10 on the same class of hardware and so much more from all these other platforms and IHVs that we have heard from. Metal is single platform, single vendor, and Vulkan… we are gonna have support for not only Windows 10 but Windows 7, Windows 8 and Linux.”


For Valve, which is developing its own Steam OS to power its living room PCs called Steam Machines, DirectX 12 clearly does not make a lot of sense. By contrast, Vulkan will be supported by Steam OS, which is based on Linux.

While it makes a great deal of sense for cross-platform developers to use Vulkan, DirectX 12 still has a number of advantages. The Vulkan API is not yet finalized, so it cannot really be used for commercial products right now. As a result, those who plan to release their titles in the next twelve months should keep using DirectX 12.

Discuss on our Facebook page, HERE.

KitGuru Says: Vulkan is an extremely promising API that could have a tremendous impact on the industry. However, since the technology is simply not ready, for many developers DirectX 12 is virtually the only choice.

The post Valve: DirectX 12 does not make a lot of sense, Vulkan does first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/valve-directx-12-does-not-make-a-lot-of-sense-vulkan-does/feed/ 157
Nvidia gets first samples of GP100 from TSMC, begins internal tests https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-receives-first-samples-of-gp100-chips-from-tsmc-begins-to-test-them/ https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-receives-first-samples-of-gp100-chips-from-tsmc-begins-to-test-them/#comments Wed, 23 Sep 2015 00:18:13 +0000 http://www.kitguru.net/?p=268782 Taiwan Semiconductor Manufacturing Co. has successfully produced the first samples of Nvidia Corp.’s code-named GP100 graphics processing unit. Nvidia has already started to test the chip internally and should be on-track to release the GPU commercially in mid-2016. 3DCenter reports that Nvidia has sent the first graphics cards based on the GP100 graphics processor to its …

The post Nvidia gets first samples of GP100 from TSMC, begins internal tests first appeared on KitGuru.]]>
Taiwan Semiconductor Manufacturing Co. has successfully produced the first samples of Nvidia Corp.’s code-named GP100 graphics processing unit. Nvidia has already started to test the chip internally and should be on track to release the GPU commercially in mid-2016.

3DCenter reports that Nvidia has sent the first graphics cards based on the GP100 processor to its subsidiary in India, where it has a lot of hardware and software developers. No actual details about the chip or the cards built on it are known, but it is about time for the graphics giant to start testing its GP100.

Nvidia taped out the GP100 in June 2015. The production cycle of TSMC’s 16nm FinFET process technology is about 90 days, so Nvidia will have received its first GP100 chips from TSMC very recently. Right now the company is testing the chip and its drivers internally.


Nvidia’s GP100 graphics processing unit is based on the “Pascal” architecture and is made using 16nm FinFET+ process technology. The chip is expected to integrate up to 6000 stream processors and contain around 17 billion transistors. Graphics cards featuring the GP100 will carry up to 32GB of HBM2 memory.

Nvidia did not comment on the story.

Discuss on our Facebook page, HERE.

KitGuru Says: It is about time for Nvidia to start testing its GP100. What remains to be seen is when exactly the company plans to formally introduce its next-generation GPUs. If the first revision of the chip is fully functional, the company may pull the GP100’s introduction forward to the first quarter of the year.

The post Nvidia gets first samples of GP100 from TSMC, begins internal tests first appeared on KitGuru.]]>
https://www.kitguru.net/components/graphic-cards/anton-shilov/nvidia-receives-first-samples-of-gp100-chips-from-tsmc-begins-to-test-them/feed/ 20