On Friday AMD launched the first cards in its new 6000 series, the HD6870 and HD6850. We reviewed many partner cards on launch day, and the resounding feedback across the net was positive. The only downside KitGuru commented on was a pricing fiasco throughout Europe.
Unfortunately, in the aftermath of the initial reviews it was discovered that some of the HD6850 cards sent out for review actually had the same shader count as the HD6870 – 1120 rather than the correct 960. Thankfully we noticed this beforehand and were sure to use the correct cards in our review (we were initially given a 1120-shader card), but it appears other publications didn't.
This has caused a lot of confusion, with people wondering if the HD6850 is worth the money. You can easily check our reviews on KitGuru for accurate results, but we decided it was time to put some cold, hard facts onto paper (well, your screens in this case) by comparing two HD6850s at identical reference clocks – one with 960 shaders and the other with 1120. We happen to have both variants at hand, so it was a pretty straightforward task.
While we will use a mixture of synthetic benchmarks today, we also wanted to judge this in 'real world' terms, using DX9, DX10 and DX11 games for direct comparison – synthetic benchmarks often don't tell the whole story. We normally test 'real world' performance via Fraps, but today we are using the games' inbuilt benchmarks, which makes it easier to analyse performance differences directly related to shader power.
So the big question is: how much difference do the 'missing' 160 unified shaders really make to your gaming experience? Let's find out.
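Before diving into the numbers, a quick back-of-envelope calculation sets the theoretical ceiling. If performance scaled perfectly with shader count at identical clocks (which real games almost never do – memory bandwidth, ROPs and CPU limits all get in the way), the 1120-shader card could be at most around 17% faster. A minimal sketch of that arithmetic:

```python
# Back-of-envelope estimate: the theoretical upper bound on the gap
# between the two HD6850 variants, assuming perfect linear scaling
# with shader count at identical clocks (real games rarely scale this way).

full_shaders = 1120   # mislabelled HD6850 cards (the HD6870's shader count)
cut_shaders = 960     # the correct HD6850 shader count

missing = full_shaders - cut_shaders
max_uplift_pct = (full_shaders / cut_shaders - 1) * 100

print(f"Missing shaders: {missing}")                    # 160
print(f"Theoretical max uplift: {max_uplift_pct:.1f}%") # 16.7%
```

Any real-world difference we measure should land somewhere between 0% and that ~16.7% ceiling, depending on how shader-bound each game is.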